Inserting tables with group and subgroup codes in SAS

First, please consider the following list of WC values...
1101 1201 1301 1401 1501 1601
1102 1202 1302 1402 1502 1602
1103 1203 1303 1403 1503 1603
1104 1204 1304 1404 1504 1604
1105 1205 1305 1405 1505 1605
1106 1206 1306 1406 1506 1606
1107 1207 1307 1407 1507 1607
1108 1208 1308 1408 1508 1608
1109 1209 1309 1409 1509 1609
1110 1210 1310 1410 1510 1610
1111 1211 1311 1411 1511 1611
1112 1212 1312 1412 1512 1612
1113 1213 1313 1413 1513 1613
1114 1214 1314 1414 1514 1614
1115 1215 1315 1415 1515 1615
1116 1216 1316 1416 1516 1616
1117 1217 1317 1417 1517 1617
1118 1218 1318 1418 1518 1618
1119
1120
Now let me explain what I am trying to achieve.
I have a very long dataset in which I have two variables, as shown in the table below...
WC ASN
1101 0
1101 1
1101 2
1101 3
1101 4
1101 20
1101 21
1101 22
1101 23
1101 24
1101 25
1101 26
1101 27
1101 28
1101 45
1101 46
1101 47
1101 48
1201 4
1201 5
1201 6
1201 7
1201 8
1201 16
1201 17
1201 18
1201 19
1201 20
1201 28
1201 29
1201 30
1201 31
1201 32
1201 41
1201 42
1201 43
1201 44
1202 4
1202 5
1202 6
1202 7
1202 8
1202 16
1202 17
1202 18
1202 19
1202 20
1202 29
1202 30
1202 31
1202 32
1202 40
1202 41
1202 42
1202 43
1202 44
What I want to do is add two more columns, Group and SubGroup, so that I get the final table shown below:
WC ASN Group SubGroup
1101 0 1 1
1101 1 1 1
1101 2 1 1
1101 3 1 1
1101 4 1 1
1101 20 1 1
1101 21 1 1
1101 22 1 1
1101 23 1 1
1101 24 1 1
1101 25 1 1
1101 26 1 1
1101 27 1 1
1101 28 1 1
1101 45 1 1
1101 46 1 1
1101 47 1 1
1101 48 1 1
1201 4 1 2
1201 5 1 2
1201 6 1 2
1201 7 1 2
1201 8 1 2
1201 16 1 2
1201 17 1 2
1201 18 1 2
1201 19 1 2
1201 20 1 2
1201 28 1 2
1201 29 1 2
1201 30 1 2
1201 31 1 2
1201 32 1 2
1201 41 1 2
1201 42 1 2
1201 43 1 2
1201 44 1 2
1301 8 1 3
1301 9 1 3
1301 10 1 3
1301 11 1 3
1301 12 1 3
1301 13 1 3
1301 14 1 3
1301 15 1 3
1301 16 1 3
1301 32 1 3
1301 33 1 3
1301 34 1 3
1301 35 1 3
1301 36 1 3
1301 37 1 3
1301 38 1 3
1301 39 1 3
1301 40 1 3
1401 8 1 4
1401 9 1 4
1401 10 1 4
1401 11 1 4
1401 12 1 4
1401 13 1 4
1401 14 1 4
1401 15 1 4
1401 16 1 4
1401 33 1 4
1401 34 1 4
1401 35 1 4
1401 36 1 4
1401 37 1 4
1401 38 1 4
1401 39 1 4
1401 40 1 4
1501 4 1 5
1501 5 1 5
1501 6 1 5
1501 7 1 5
1501 8 1 5
1501 16 1 5
1501 17 1 5
1501 18 1 5
1501 19 1 5
1501 20 1 5
1501 29 1 5
1501 30 1 5
1501 31 1 5
1501 32 1 5
1501 40 1 5
1501 41 1 5
1501 42 1 5
1501 43 1 5
1501 44 1 5
1601 0 1 6
1601 1 1 6
1601 2 1 6
1601 3 1 6
1601 4 1 6
1601 20 1 6
1601 21 1 6
1601 22 1 6
1601 23 1 6
1601 24 1 6
1601 25 1 6
1601 26 1 6
1601 27 1 6
1601 28 1 6
1601 44 1 6
1601 45 1 6
1601 46 1 6
1601 47 1 6
1601 48 1 6
I was trying something like this...
select;
when (WC = 1101) do; group = 1; subgroup = 1; end;
when (WC = 1201) do; group = 1; subgroup = 2; end;
when (WC = 1301) do; group = 1; subgroup = 3; end;
when (WC = 1401) do; group = 1; subgroup = 4; end;
when (WC = 1501) do; group = 1; subgroup = 5; end;
when (WC = 1601) do; group = 1; subgroup = 6; end;
when (WC = 1102) do; group = 2; subgroup = 1; end;
when (WC = 1202) do; group = 2; subgroup = 2; end;
when (WC = 1302) do; group = 2; subgroup = 3; end;
when (WC = 1402) do; group = 2; subgroup = 4; end;
.
.
.
when (WC = 1617) do; group = 18; subgroup = 5; end;
when (WC = 1618) do; group = 18; subgroup = 6; end;
when (WC = 1119) do; group = 1; subgroup = 1; end;
otherwise do; group = 20; subgroup = 1; end;
end;
This is seriously long, tedious and confusing. I am sure there are better and shorter ways of doing it.
Please help.
As Joe and Keith suggested, I am adding this explanation.
WC values in the table, as they appear...
WC1 WC2 WC3 WC4 WC5 WC6 Group
1101 1201 1301 1401 1501 1601 1
1102 1202 1302 1402 1502 1602 2
1103 1203 1303 1403 1503 1603 3
.
.
.
WC ASN Group SubGroup
1101 0 1 1 (because the last two digits are 01 and the 2nd digit is 1)
.
.
1206 1 6 2
.
.
1201 2 1 2
.
.
1213 3 13 2
.
.
1610 4 10 6

Seems like a case for substr to me.
group = substr(wc,1,1);
subgroup = substr(wc,2,1);
Or whatever your rule is. If your rule is more complicated than that, explain the rule.
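For the rule described above (Group = the last two digits of WC, SubGroup = the second digit of WC), a minimal data step sketch, assuming the input dataset is called have and that WC is numeric, could look like this:
data want;
    set have;                          /* assumed input holding WC and ASN              */
    group    = mod(WC, 100);           /* last two digits: 1101 -> 1, 1213 -> 13        */
    subgroup = mod(int(WC/100), 10);   /* second digit:    1101 -> 1, 1213 -> 2         */
run;
If WC is stored as character instead, the same values can be obtained with input(substr(WC, 3, 2), 2.) and input(substr(WC, 2, 1), 1.).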

Related

Point chart - two (or more) data rows

I would like to add the average Y-axis value for each X-axis value to a point chart. Is there any way to do this, please? I would like to achieve a result similar to the second picture.
The expected result is shown there.
Data example
id  datum       year  month  day  weekday  hour  hourly_steps  cumulative_daily_steps  daily_steps
1   2021-01-01  2021  1      1    5        17    49            49                      5837
2   2021-01-01  2021  1      1    5        18    4977          5026                    5837
3   2021-01-01  2021  1      1    5        19    692           5718                    5837
4   2021-01-01  2021  1      1    5        20    13            5731                    5837
5   2021-01-01  2021  1      1    5        22    106           5837                    5837
6   2021-01-02  2021  1      2    6        6     48            48                      7965
7   2021-01-02  2021  1      2    6        9     97            145                     7965
8   2021-01-02  2021  1      2    6        10    1109          1254                    7965
9   2021-01-02  2021  1      2    6        11    253           1507                    7965
10  2021-01-02  2021  1      2    6        12    126           1633                    7965
11  2021-01-02  2021  1      2    6        13    51            1684                    7965
12  2021-01-02  2021  1      2    6        14    690           2374                    7965
13  2021-01-02  2021  1      2    6        15    3690          6064                    7965
14  2021-01-02  2021  1      2    6        16    956           7020                    7965
15  2021-01-02  2021  1      2    6        17    667           7687                    7965
16  2021-01-02  2021  1      2    6        18    36            7723                    7965
17  2021-01-02  2021  1      2    6        19    45            7768                    7965
18  2021-01-02  2021  1      2    6        20    38            7806                    7965
19  2021-01-02  2021  1      2    6        21    47            7853                    7965
20  2021-01-02  2021  1      2    6        22    15            7868                    7965
21  2021-01-02  2021  1      2    6        23    97            7965                    7965
22  2021-01-03  2021  1      3    7        0     147           147                     8007
23  2021-01-03  2021  1      3    7        7     15            162                     8007
24  2021-01-03  2021  1      3    7        8     54            216                     8007
25  2021-01-03  2021  1      3    7        9     47            263                     8007
26  2021-01-03  2021  1      3    7        10    16            279                     8007
27  2021-01-03  2021  1      3    7        11    16            295                     8007
28  2021-01-03  2021  1      3    7        12    61            356                     8007
29  2021-01-03  2021  1      3    7        13    1459          1815                    8007
30  2021-01-03  2021  1      3    7        14    2869          4684                    8007
31  2021-01-03  2021  1      3    7        15    2670          7354                    8007
32  2021-01-03  2021  1      3    7        16    131           7485                    8007
33  2021-01-03  2021  1      3    7        17    67            7552                    8007
34  2021-01-03  2021  1      3    7        18    27            7579                    8007
35  2021-01-03  2021  1      3    7        19    50            7629                    8007
36  2021-01-03  2021  1      3    7        20    48            7677                    8007
37  2021-01-03  2021  1      3    7        22    119           7796                    8007
38  2021-01-03  2021  1      3    7        23    211           8007                    8007
39  2021-01-04  2021  1      4    1        4     19            19                      6022
40  2021-01-04  2021  1      4    1        6     94            113                     6022
41  2021-01-04  2021  1      4    1        10    48            161                     6022
42  2021-01-04  2021  1      4    1        11    97            258                     6022
43  2021-01-04  2021  1      4    1        12    48            306                     6022
44  2021-01-04  2021  1      4    1        13    39            345                     6022
45  2021-01-04  2021  1      4    1        14    499           844                     6022
46  2021-01-04  2021  1      4    1        15    799           1643                    6022
47  2021-01-04  2021  1      4    1        16    180           1823                    6022
48  2021-01-04  2021  1      4    1        17    55            1878                    6022
49  2021-01-04  2021  1      4    1        18    27            1905                    6022
50  2021-01-04  2021  1      4    1        19    2246          4151                    6022
51  2021-01-04  2021  1      4    1        20    1518          5669                    6022
52  2021-01-04  2021  1      4    1        21    247           5916                    6022
53  2021-01-04  2021  1      4    1        22    106           6022                    6022
54  2021-01-05  2021  1      5    2        4     18            18                      7623
55  2021-01-05  2021  1      5    2        6     44            62                      7623
56  2021-01-05  2021  1      5    2        7     51            113                     7623
57  2021-01-05  2021  1      5    2        8     450           563                     7623
58  2021-01-05  2021  1      5    2        9     385           948                     7623
59  2021-01-05  2021  1      5    2        10    469           1417                    7623
60  2021-01-05  2021  1      5    2        11    254           1671                    7623
61  2021-01-05  2021  1      5    2        12    1014          2685                    7623
62  2021-01-05  2021  1      5    2        13    415           3100                    7623
63  2021-01-05  2021  1      5    2        14    297           3397                    7623
64  2021-01-05  2021  1      5    2        15    31            3428                    7623
65  2021-01-05  2021  1      5    2        17    50            3478                    7623
66  2021-01-05  2021  1      5    2        18    3771          7249                    7623
67  2021-01-05  2021  1      5    2        19    52            7301                    7623
68  2021-01-05  2021  1      5    2        20    96            7397                    7623
69  2021-01-05  2021  1      5    2        21    59            7456                    7623
70  2021-01-05  2021  1      5    2        22    167           7623                    7623
71  2021-01-06  2021  1      6    3        6     54            54                      7916
72  2021-01-06  2021  1      6    3        7     1223          1277                    7916
73  2021-01-06  2021  1      6    3        8     118           1395                    7916
74  2021-01-06  2021  1      6    3        10    77            1472                    7916
75  2021-01-06  2021  1      6    3        11    709           2181                    7916
76  2021-01-06  2021  1      6    3        12    123           2304                    7916
77  2021-01-06  2021  1      6    3        13    36            2340                    7916
78  2021-01-06  2021  1      6    3        14    14            2354                    7916
79  2021-01-06  2021  1      6    3        15    156           2510                    7916
80  2021-01-06  2021  1      6    3        16    149           2659                    7916
81  2021-01-06  2021  1      6    3        17    995           3654                    7916
82  2021-01-06  2021  1      6    3        18    2022          5676                    7916
83  2021-01-06  2021  1      6    3        19    34            5710                    7916
84  2021-01-06  2021  1      6    3        21    937           6647                    7916
85  2021-01-06  2021  1      6    3        22    1208          7855                    7916
86  2021-01-06  2021  1      6    3        23    61            7916                    7916
Here you go.
Add a Deneb visual and then add the fields used in the spec (hour, weekday, cumulative_daily_steps), ensuring that "Don't summarize" is selected for each column.
Inside Deneb, paste the following spec.
{
  "data": {"name": "dataset"},
  "transform": [
    {
      "calculate": "datum['weekday ']<= 5?'weekday':'weekend'",
      "as": "type"
    }
  ],
  "layer": [
    {"mark": {"type": "point"}},
    {
      "mark": {"type": "line", "interpolate": "basis"},
      "encoding": {
        "x": {"field": "hour"},
        "y": {"aggregate": "mean", "field": "cumulative_daily_steps"}
      }
    }
  ],
  "encoding": {
    "x": {
      "field": "hour",
      "type": "quantitative",
      "axis": {"title": "Hour of Day"}
    },
    "y": {
      "field": "cumulative_daily_steps",
      "type": "quantitative",
      "axis": {"title": "Cumulative Step Count"}
    },
    "color": {
      "field": "type",
      "type": "nominal",
      "scale": {"range": ["red", "green"]},
      "legend": {"title": ""}
    }
  }
}

How to add a row where there is a disruption in series of numbers in Stata

I'm attempting to format a table of 40 different age-race-sex strata to be input into R-INLA and noticed that it's important to include all strata (even if they are not present in a county); these should be zeros. However, at this point my table only contains records for strata that are not empty. I can identify places where strata are missing for each county by looking at my strata variable and finding the breaks in the series 1 through 40 (marked with a red x in the image below).
In these places (marked by the red x) I need to add the missing rows and fill in the corresponding county code, strata code, population = 0, and the correct race, sex and age codes for the stratum.
If I can figure out a way to add an empty row in the spaces with the red Xs from the image, and correctly assign the strata code (and county code) to these empty/missing rows, I am able to populate the rest of the values with the code below:
replace race = 1 if strata == 4
replace sex = 1 if strata == 4
replace age = 4 if strata == 4
...etc
I'm wondering if there is a way to add the missing rows using an if statement that considers the fact that there are supposed to be forty strata for each county code. It would be ideal if this could populate the correct county code and strata code as well!
Dataex sample data:
* Example generated by -dataex-. To install: ssc install dataex
clear
input float OID str5 fips_statecounty double population byte(race sex age) float strata
1 "" 672 1 1 1 1
2 "" 1048 1 1 2 2
3 "" 883 1 1 3 3
4 "" 1129 1 1 4 4
5 "" 574 1 2 1 5
6 "" 986 1 2 2 6
7 "" 899 1 2 3 7
8 "" 1820 1 2 4 8
9 "" 96 2 1 1 9
10 "" 142 2 1 2 10
11 "" 81 2 1 3 11
12 "" 99 2 1 4 12
13 "" 71 2 2 1 13
14 "" 125 2 2 2 14
15 "" 103 2 2 3 15
16 "" 162 2 2 4 16
17 "" 31 3 1 1 17
18 "" 32 3 1 2 18
19 "" 18 3 1 3 19
20 "" 31 3 1 4 20
21 "" 22 3 2 1 21
22 "" 28 3 2 2 22
23 "" 28 3 2 3 23
24 "" 44 3 2 4 24
25 "" 20 4 1 1 25
26 "" 24 4 1 2 26
27 "" 21 4 1 3 27
28 "" 43 4 1 4 28
29 "" 19 4 2 1 29
30 "" 26 4 2 2 30
31 "" 24 4 2 3 31
32 "" 58 4 2 4 32
33 "" 6 5 1 1 33
34 "" 11 5 1 2 34
35 "" 13 5 1 3 35
36 "" 7 5 1 4 36
37 "" 7 5 2 1 37
38 "" 9 5 2 2 38
39 "" 10 5 2 3 39
40 "" 11 5 2 4 40
41 "01001" 239 1 1 1 1
42 "01001" 464 1 1 2 2
43 "01001" 314 1 1 3 3
44 "01001" 232 1 1 4 4
45 "01001" 284 1 2 1 5
46 "01001" 580 1 2 2 6
47 "01001" 392 1 2 3 7
48 "01001" 440 1 2 4 8
49 "01001" 41 2 1 1 9
50 "01001" 38 2 1 2 10
51 "01001" 23 2 1 3 11
52 "01001" 26 2 1 4 12
53 "01001" 34 2 2 1 13
54 "01001" 52 2 2 2 14
55 "01001" 40 2 2 3 15
56 "01001" 50 2 2 4 16
57 "01001" 4 3 1 1 17
58 "01001" 2 3 1 2 18
59 "01001" 3 3 1 3 19
60 "01001" 6 3 2 1 21
61 "01001" 4 3 2 2 22
62 "01001" 6 3 2 3 23
63 "01001" 4 3 2 4 24
64 "01001" 1 4 1 4 28
65 "01003" 1424 1 1 1 1
66 "01003" 2415 1 1 2 2
67 "01003" 1680 1 1 3 3
68 "01003" 1823 1 1 4 4
69 "01003" 1545 1 2 1 5
70 "01003" 2592 1 2 2 6
71 "01003" 1916 1 2 3 7
72 "01003" 2527 1 2 4 8
73 "01003" 68 2 1 1 9
74 "01003" 82 2 1 2 10
75 "01003" 52 2 1 3 11
76 "01003" 54 2 1 4 12
77 "01003" 72 2 2 1 13
78 "01003" 129 2 2 2 14
79 "01003" 81 2 2 3 15
80 "01003" 106 2 2 4 16
81 "01003" 10 3 1 1 17
82 "01003" 14 3 1 2 18
83 "01003" 8 3 1 3 19
84 "01003" 4 3 1 4 20
85 "01003" 8 3 2 1 21
86 "01003" 14 3 2 2 22
87 "01003" 17 3 2 3 23
88 "01003" 10 3 2 4 24
89 "01003" 4 4 1 1 25
90 "01003" 1 4 1 3 27
91 "01003" 2 4 1 4 28
92 "01003" 2 4 2 1 29
93 "01003" 3 4 2 2 30
94 "01003" 4 4 2 3 31
95 "01003" 10 4 2 4 32
96 "01003" 5 5 1 1 33
97 "01003" 4 5 1 2 34
98 "01003" 3 5 1 3 35
99 "01003" 5 5 1 4 36
100 "01003" 5 5 2 2 38
end
label values race race
label values sex sex
My answer to your previous question, Nested for-loop: error variable already defined, detailed how to create a minimal dataset with all strata present. Therefore you should just merge that with your main dataset and replace missing values on the absent strata with whatever your other software expects (zeros, it seems).
The complication most obvious at this point is you need to factor in a county variable. I can't see any information on how many counties you have in your dataset, which may affect what is practical. You should be able to break down the preparation into: first, prepare a minimal county dataset with identifiers only; then merge that with a complete strata dataset.
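For example, a minimal sketch of that merge step, assuming the main dataset is in memory, that every county should carry strata 1 through 40, and using a temporary file name of my own choosing:
preserve
keep fips_statecounty
duplicates drop                           // one row per county
expand 40                                 // 40 observations per county
bysort fips_statecounty: generate strata = _n
tempfile allstrata
save `allstrata'
restore
merge 1:1 fips_statecounty strata using `allstrata'
replace population = 0 if _merge == 2     // strata absent from the original data
drop _merge
The race, sex and age codes on the newly added rows can then be filled in from strata with replace statements like those sketched above.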

Subtract Set Value at Aggregated Level

Values are for two groups by quarter.
In DAX, I need to summarize all the data but also subtract 5 from each quarter (20 for the full year) in 2021 for Group 1, without allowing the value to go below 0.
This only impacts:
Group 1
2021
However, I also need to retain the data details without the adjustment. So I can't do this in Power Query.
Data:
Group  Date        Value
1      01/01/2020  10
1      02/01/2020  9
1      03/01/2020  10
1      04/01/2020  8
1      05/01/2020  10
1      06/01/2020  11
1      07/01/2020  18
1      08/01/2020  2
1      09/01/2020  1
1      10/01/2020  0
1      11/01/2020  1
1      12/01/2020  0
1      01/01/2021  1
1      02/01/2021  12
1      03/01/2021  12
1      04/01/2021  3
1      05/01/2021  13
1      06/01/2021  14
1      07/01/2021  7
1      08/01/2021  1
1      09/01/2021  0
1      10/01/2021  1
1      11/01/2021  2
1      12/01/2021  1
2      01/01/2020  18
2      02/01/2020  7
2      03/01/2020  6
2      04/01/2020  8
2      05/01/2020  12
2      06/01/2020  13
2      07/01/2020  14
2      08/01/2020  8
2      09/01/2020  7
2      10/01/2020  6
2      11/01/2020  5
2      12/01/2020  4
2      01/01/2021  12
2      02/01/2021  18
2      03/01/2021  19
2      04/01/2021  20
2      05/01/2021  12
2      06/01/2021  12
2      07/01/2021  7
2      08/01/2021  18
2      09/01/2021  16
2      10/01/2021  15
2      11/01/2021  13
2      12/01/2021  1
Result:
Qtr/Year  Group 1 Value  Group 2 Value  Total
Q1-2020   29             31             60
Q2-2020   29             33             62
Q3-2020   21             29             50
Q4-2020   1              15             16
2020      80             108            188
Q1-2021   20             49             69
Q2-2021   25             44             69
Q3-2021   3              41             44
Q4-2021   0              29             29
2021      48             163            211
I'd suggest summarizing at the Year/Quarter/Group granularity and summing that up as follows:
SumValue =
VAR Summary =
    SUMMARIZE (
        Table2,
        Table2[Year],
        Table2[Qtr],
        Table2[Group],
        "#RawValue", SUM ( Table2[Value] ),
        "#RemoveValue", IF ( Table2[Year] = 2021 && Table2[Group] = 1, 5 )
    )
RETURN
    SUMX ( Summary, MAX ( [#RawValue] - [#RemoveValue], 0 ) )
(This assumes the amount to remove for a year is the same as for four quarters.)
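The measure references Table2[Year] and Table2[Qtr], which are assumed to already exist in the model; if the table only holds the Date column from the sample data, they could be added as calculated columns along these lines (a sketch, column names mine):
Year = YEAR ( Table2[Date] )
Qtr = QUARTER ( Table2[Date] )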

SQL Update Subsequent Column OFFSET FETCH NEXT

I would like to know whether there is a way to do an automatic looping / counter batch update of a SQL column, for example using the OFFSET / FETCH NEXT method.
QUESTION: The table below has 20 rows. I would like to update the DealerId column so that the first 4 rows get the values 1, 2, 3, 4 and each subsequent group of 4 rows repeats the values 1, 2, 3, 4.
Something like below:
NEED TO MODIFY TABLE
Id DealerId
1 1 1
2 2 2
3 3 3
4 4 4
5 5 1
6 6 2
7 7 3
8 8 4
9 9 1
10 10 2
11 11 3
12 12 4
13 13 1
14 14 2
15 15 3
16 16 4
17 17 1
18 18 2
19 19 3
20 20 4
ORIGINAL TABLE
Id DealerId StoreId TerminalId MessageNo CreatedDate
1 1 86 5027 029500021201403031434350039 2014-03-03 14:34:37.347
2 2 86 5027 029500021201403031434350039 2014-03-05 10:31:59.903
3 3 86 5027 029500021201403031434350039 2014-03-05 10:33:41.293
4 4 86 5027 029500021201403031434350039 2014-03-05 10:46:50.057
5 5 86 5027 029500021201403031434350039 2014-03-05 10:50:23.910
6 6 33 5338 004000003201403051508010255 2014-03-05 15:08:03.247
7 7 26 5595 704201181201403061024330013 2014-03-06 10:24:34.590
8 8 26 5595 704201181201403061026180022 2014-03-06 10:26:19.517
9 9 33 5338 004000003201403061043150312 2014-03-06 10:43:16.013
10 10 86 5027 029500021201403031434350039 2014-03-06 14:27:51.717
11 11 86 5027 029500021201403031434350039 2014-03-06 14:38:40.593
12 12 86 5027 029500021201403031434350039 2014-03-06 14:44:25.947
13 13 521 4905 051100003002447 2014-03-07 12:51:07.487
14 14 521 4905 051100003002447 2014-03-07 12:55:07.300
15 15 521 4905 051100003002447 2014-03-07 12:56:24.793
16 16 521 4905 051100003002447 2014-03-07 12:57:43.123
17 17 521 4905 051100003002447 2014-03-07 14:15:11.093
18 18 632 5120 088800003201403071441280026 2014-03-07 14:41:29.733
19 19 632 5120 088800003201403071456500050 2014-03-07 14:56:51.727
20 20 632 5120 088800003201403071459240064 2014-03-07 14:59:24.953
Assuming that all ids are consecutive, starting from 1:
In MySQL:
update OriginalTable
set DealerId = mod(id - 1, 4) + 1;
and in Microsoft SQL Server:
update OriginalTable
set DealerId = ((id - 1) % 4) + 1;
And if the ids are not consecutive (or do not start from 1) you can use a cursor to update them one by one:
DECLARE c1 CURSOR FOR
SELECT id, dealerId
FROM OriginalTable
ORDER BY id, dealerId
OPEN c1
declare @id int
declare @dealerId int
declare @i int
set @i = 1
FETCH NEXT FROM c1
INTO @id, @dealerId
while @@FETCH_STATUS = 0
BEGIN
update OriginalTable
set dealerId = @i
where current of c1
if (@i < 4)
set @i = @i + 1
else
set @i = 1
FETCH NEXT FROM c1
INTO @id, @dealerId
END
CLOSE c1
DEALLOCATE c1
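A set-based alternative for SQL Server, assuming the rows only need to be numbered in Id order (a sketch; it also works when the Id values have gaps or do not start from 1):
WITH numbered AS (
    SELECT DealerId,
           ROW_NUMBER() OVER (ORDER BY Id) AS rn   -- position of the row in Id order
    FROM OriginalTable
)
UPDATE numbered
SET DealerId = ((rn - 1) % 4) + 1;
The UPDATE through the CTE writes to the underlying OriginalTable rows in their Id order, cycling the values 1 to 4.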

Regular Expression complex serie with specific pattern

I will try to explain what I need help with.
The numbers in the series below that I want the regex to check are "2 904", "3 231", "2 653", "2 653", "2 353" and so on. My goal is to get a match only if one of these numbers appears in the format "123" (three digits, i.e. between 100 and 999).
No match:
sö 31 1 2 904 2 3 231 3 2 653 32 4 2 653 5 2 353 6 2 353 7 2 353 8 2 904 9 3 002 10 3 143 33 11 2 615 12 2 353 13 2 353 14 2 353 15 2 353 16 2 653 17 2 353 34 18 2 157 19 1 699 20 1 699
Match:
sö 31 1 2 904 2 3 231 3 653 32 4 2 653 5 2 353 6 2 353 7 2 353 8 2 904 9 3 002 10 3 143 33 11 2 615 12 2 353 13 2 353 14 2 353 15 2 353 16 2 653 17 2 353 34 18 2 157 19 1 699 20 1 699
sö 31 1 2 904 2 3 231 3 2 653 32 4 2 653 5 2 353 6 2 353 7 2 353 8 2 904 9 3 002 10 3 143 33 11 2 615 12 2 353 13 2 353 14 953 15 2 353 16 2 653 17 2 353 34 18 2 157 19 1 699 20 1 699
As you can see from my examples, the number "2 653" changed to "653" just after the number "3",
and the number "2 353" changed to "953" after the number "14".
The numbers in between, i.e. 1-20, are static and will never change.
Is this possible?
I will try it then at http://rubular.com/