I have the following data set structure:
date time_in_hours price
Sep 03 08 9.76 98
Sep 03 08 10.43 97
Sep 03 08 10.98 96
Sep 03 08 11.48 99
Sep 04 08 2.35 98
Sep 04 08 2.58 98.45
Sep 04 08 3.45 96.3
Sep 04 08 3.89 96.25
Sep 04 08 4.18 100
Sep 05 08 12.65 101
Sep 05 08 12.96 100.25
Sep 05 08 13.25 104.35
Sep 05 08 13.78 98
My data is for the years 2008 and 2009. It contains a total of 504 trading days.
My objective is to interpolate prices at every half hour (e.g. 9.5, 10, 10.5, 11, 11.5, etc.), but only for the time interval between 9.5 and 16.
I have been struggling with the interpolate/aggregate commands, given that I must interpolate over a specific time interval for each calendar date. My final output must also contain the date, time, and price, something like this:
date time_in_hours price
Sep 03 08 10 98
Sep 03 08 10.5 97
Sep 03 08 11 96
Sep 03 08 11.5 99
Sep 04 08 2.5 98
Sep 04 08 3 98.45
The code below gives you the output you have stated, but based on all the comments above I am not sure it will solve your problem completely. Note that round(x, 0.5) uses 0.25 as the boundary, so 2.74 becomes 2.5 while 2.75 becomes 3.
data test;
infile datalines dsd;
input date :$20. time_in_hours price;
datalines;
Sep 03 08,9.76,98
Sep 03 08,10.43,97
Sep 03 08,10.98,96
Sep 03 08,11.48,99
Sep 04 08,2.35,98
Sep 04 08,2.58,98.45
Sep 04 08,3.45,96.3
Sep 04 08,3.89,96.25
Sep 04 08,4.18,100
Sep 05 08,12.65,101
Sep 05 08,12.96,100.25
Sep 05 08,13.25,104.35
Sep 05 08,13.78,98
;
run;
proc print;
run;
data test2;
set test(rename = (time_in_hours = old_time_in_hours));
time_in_hours = round(old_time_in_hours, 0.5);
if (9.5 <= time_in_hours <= 16);
run;
proc print;
run;
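For what it's worth, the half-hour interpolation per calendar date can also be sketched outside SAS. Below is a rough pandas sketch, assuming the data sits in a DataFrame named df with columns date, time_in_hours and price (names assumed here, not taken from your code):
import numpy as np
import pandas as pd

# Rough sketch, not SAS: linear interpolation onto a half-hour grid per calendar date.
# Assumes a DataFrame `df` with columns date, time_in_hours and price.
grid = np.arange(9.5, 16.01, 0.5)   # 9.5, 10.0, ..., 16.0

def interp_one_day(day):
    day = day.sort_values("time_in_hours")
    # np.interp clamps: grid points outside the observed times get the
    # nearest observed price rather than being extrapolated.
    prices = np.interp(grid, day["time_in_hours"], day["price"])
    return pd.DataFrame({"time_in_hours": grid, "price": prices})

half_hourly = (df.groupby("date", sort=False)
                 .apply(interp_one_day)
                 .reset_index(level=0)
                 .reset_index(drop=True))
Days whose quotes fall entirely outside 9.5-16 (like the 2.35-4.18 times above) would just repeat the nearest observed price, so you may want to drop or handle those dates separately.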
I have a file that contains information about programs.
What I want is to get some information about a particular program.
This is the structure of the file:
sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30...somtext
I want to get the following details from the above file. Each field is separated with ';':
p = program.exe
dt1 = Thu, 04 May 2017 08:58:48 -0700
dt2 = Wed, 27 Sep 2017 10:50:00 -0700
dt3 = Wed, 04 Oct 2017 00:00:31 -0700
d1 = True
d2 = False
Get-Content .\file.txt
So far I have \W*((?i)program.exe(?-i))\W* to match it.
But I don't know how to move forward to read all of the fields and parse them.
:\> @'
>> sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30...somtext
>> '@|
>> Select-String '(?i)\W*(program\.exe)\W*(.*?;)(.*?;)(.*?;)(.*?;)(.*?;)'|
>> % {$_.Matches}|
>> % {$p=$_.Groups[1].Value;$dt1=$_.Groups[2].Value;$dt2=$_.Groups[3].Value;$dt3=$_.Groups[4].Value;$d1=$_.Groups[5].Value;$d2=$_.Groups[6].Value}
:\> $p
program.EXE
:\> $dt1
Thu, 04 May 2017 08:58:48 -0700;
:\> $dt2
Wed, 27 Sep 2017 10:50:00 -0700;
:\> $dt3
Wed, 04 Oct 2017 00:00:31 -0700;
:\> $d1
True;
:\> $d2
False, 17:38:05.810;
:\>
OR
IN: \> "sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30..
.somtext"|
IN: >> Select-String '(?i)\W*(program\.exe)\W*(.*?;)(.*?;)(.*?;)(.*?;)(.*?;)' -OutVariable o
OUT:
sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30...somtext
IN: \> $f,$p,$dt1,$dt2,$dt3,$d1,$d2=% -inputObject $o.Matches.Groups {$_.Value}
IN: \> $d2
OUT: False, 17:38:05.810;
I assigned each group to the variable required. See if this works. Apologies for any naivety; I'm not well versed in PowerShell.
Next try
$p,$dt1,$dt2,$dt3,$d1,$d2=@'
sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30...somtext
'@|
Select-String '(program.exe)[^;]*(?:;([^;]+)){3}(?:;(true|false)){2},' -AllMatches|
ForEach-Object {$_.Matches}|
ForEach-Object {$_.Groups[1..3]}|
ForEach-Object {$_.Captures}|
Select-Object -ExpandProperty Value
$p,$dt1,$dt2,$dt3,$d1,$d2
Can this help you?
@'
sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;True;False, 17:38:05.810;30...somtext
'@|
Select-String '([^;]+)' -AllMatches|
ForEach-Object {$_.Matches}|
ForEach-Object {$_.Groups[1].Value}
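For comparison, the same extraction can be sketched outside PowerShell. A minimal Python sketch, assuming the record is available as one string (the sample line is copied from the question):
import re

# Split the record on ';' starting from the "program.exe" token.
line = ("sometext...program.EXE;Thu, 04 May 2017 08:58:48 -0700;"
        "Wed, 27 Sep 2017 10:50:00 -0700;Wed, 04 Oct 2017 00:00:31 -0700;"
        "True;False, 17:38:05.810;30...somtext")

m = re.search(r"(program\.exe[^;]*);([^;]+);([^;]+);([^;]+);([^;]+);([^;]+)", line, re.I)
if m:
    p, dt1, dt2, dt3, d1, d2 = m.groups()
    print(p, dt1, dt2, dt3, d1, d2, sep="\n")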
I have the following pandas DataFrame:
ID Year Jan_salary Jan_days Feb_salary Feb_days Mar_salary Mar_days
1 2016 4500 22 4200 18 4700 24
2 2016 3800 23 3600 19 4400 23
3 2016 5500 21 5200 17 5300 23
I want to convert this DataFrame to the following DataFrame:
ID Year month salary days
1 2016 01 4500 22
1 2016 02 4200 18
1 2016 03 4700 24
2 2016 01 3800 23
2 2016 02 3600 19
2 2016 03 4400 23
3 2016 01 5500 21
3 2016 02 5200 17
3 2016 03 5300 23
I tried to use pandas.DataFrame.stack but couldn't get the expected outcome.
I am using Python 2.7.
Please guide me on how to reshape this pandas DataFrame.
Thanks.
# Move ID and Year into the index so only the month/measure columns remain
df = df.set_index(['ID', 'Year'])
# Split 'Jan_salary' -> ('Jan', 'salary') into a two-level column index and name the month level
df.columns = df.columns.str.split('_', expand=True).rename('month', level=0)
# Stack the month level into the rows, then restore ID/Year as columns
df = df.stack(0).reset_index()
# Map month abbreviations to two-digit month numbers
md = dict(Jan='01', Feb='02', Mar='03')
df.month = df.month.map(md)
# Reorder the columns to match the desired output
df[['ID', 'Year', 'month', 'salary', 'days']]
I love pd.melt so that's what I used in this long-winded approach:
ldf = pd.melt(df, id_vars=['ID', 'Year'],
              value_vars=['Jan_salary', 'Feb_salary', 'Mar_salary'],
              var_name='month', value_name='salary')
rdf = pd.melt(df, id_vars=['ID', 'Year'],
              value_vars=['Jan_days', 'Feb_days', 'Mar_days'],
              value_name='days')
# Keep only the 'days' column; ID, Year and the month label are already in ldf
rdf.drop(['ID', 'Year', 'variable'], inplace=True, axis=1)
cdf = pd.concat([ldf, rdf], axis=1)
# 'Jan_salary' -> 'Jan'
cdf['month'] = cdf['month'].str.replace('_salary', '')
import calendar

def mapper(month_abbr):
    # from http://stackoverflow.com/a/3418092/42346
    d = {v: str(k).zfill(2) for k, v in enumerate(calendar.month_abbr)}
    return d[month_abbr]

# 'Jan' -> '01', 'Feb' -> '02', ...
cdf['month'] = cdf['month'].apply(mapper)
Result:
>>> cdf
ID Year month salary days
0 1 2016 01 4500 22
1 2 2016 01 3800 23
2 3 2016 01 5500 21
3 1 2016 02 4200 18
4 2 2016 02 3600 19
5 3 2016 02 5200 17
6 1 2016 03 4700 24
7 2 2016 03 4400 23
8 3 2016 03 5300 23
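One small note: the result above is ordered by month and then ID, while the desired output in the question is grouped by ID first. If that ordering matters, a sort at the end should do it (assuming a reasonably recent pandas):
# Reorder rows to match the desired output: by ID, then month
cdf = cdf.sort_values(['ID', 'month']).reset_index(drop=True)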
The simple workflow for deploying/invoking a chaincode (to my knowledge) is:
Deploy a chaincode(smart contract) on the blockchain
This brings up a docker container on all peers that has the chaincode running in it
Invoke some function
This type of function changes the values of variables in chaincode state
For asset_management.go, the chaincode can be tested by running go test in the asset_management chaincode directory. But this does not really bring up a Docker container (or does it?) that runs the asset_management chaincode.
What's the right way to deploy/invoke this chaincode, and how is it different from deploying/invoking chaincodes through the REST interface (as we do for chaincode_example02)?
The chaincode workflow you mentioned is correct. Just one detail regarding variables in chaincode state: the variables are stored in a global key-value collection named the World State, which is accessed through chaincode invocations and is access-protected.
Now, what you are doing with go test is running the code in asset_management_test.go. If you look at this code, you will see that it basically starts a VP and a CA and then sends transactions to test that the chaincode works. For example:
// Now create the Transactions message and send to Peer.
transaction, err := txHandler.NewChaincodeExecute(chaincodeInvocationSpec, tid)
You could also write a test file for chaincode_example02 and test it the same way.
Or you can deploy the asset_management chaincode the same way you deploy chaincode_example02, either in a chaincode development environment or on a development network.
Important: the asset_management chaincode is used to test invocation access control, so it is fairly complex. Invoking its methods means using digital signatures to check the identity of the chaincode invoker. You can check the asset_management_test file to see how this is done.
The list of steps for anybody who would like to run “asset_management_with_roles” manually:
Check out Fabric and run vagrant from the "devenv" folder.
ssh to the started container.
Reset Fabric’s configuration:
rm -rf /var/hyperledger/production
Enable attribute certificate authority in membersrvc.yaml
aca.enabled: true
Enable security in core.yaml
security.enable: true
Switch the log level for "node" to "debug" in core.yaml (optional; not necessary if you already know the certificates)
logging.node: debug
Run membersrvc in background:
nohup membersrvc &> /tmp/membersrvc.log &
Run the peer service:
peer node start
Verify that the users "assigner", "bob", and "alice" are in membersrvc.yaml. According to the comment in this example, we will work with:
// This example implements asset transfer using attributes support and specifically Attribute Based Access Control (ABAC).
// There are three users in this example:
// - alice
// - bob
// - assigner
//
// This users are defined in the section "eca" of asset.yaml file.
// In the section "aca" of asset.yaml file two attributes are defined to this users:
// The first attribute is called 'role' with this values:
// - alice has role = client
// - bob has role = client
// - assigner has role = assigner
//
// The second attribute is called 'account' with this values:
// - alice has account = 12345-56789
// - bob has account = 23456-67890
Open another ssh terminal with vagrant and log in to the network:
peer network login assigner -p Tc43PeqBl11
peer network login bob -p NOE63pEQbL25
peer network login alice -p CMS10pEQlB16
Deploy the chaincode to the network using the "assigner" security context:
curl -XPOST -d '{"jsonrpc": "2.0", "method": "deploy", "params": {"type": 1,"chaincodeID": {"path": "github.com/hyperledger/fabric/examples/chaincode/go/asset_management_with_roles","language": "GOLANG"}, "ctorMsg": { "args": ["init"] }, "metadata":[97, 115, 115, 105, 103, 110, 101, 114] ,"secureContext": "assigner"} ,"id": 0}' http://localhost:7050/chaincode
The metadata contains the UTF-8 encoded string "assigner". This string will be saved in the ledger, and only a user with that role will be able to execute the "assign" function in the smart contract.
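As a side check, the byte array above is just the UTF-8 encoding of the word "assigner"; for example, in Python:
# Prints [97, 115, 115, 105, 103, 110, 101, 114] -- the metadata array used above
print(list(bytearray("assigner".encode("utf-8"))))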
To keep the example readable, let's save the chaincode ID in a local variable:
export HASH=7adc030881c07c39d2edac0b1560bf7cf2b7f0a4bce74fe7e6144e3f36e1bf2d176093d4c23ba58712a9589d9600e6d9ef596a1521a4c5227c222d8af2bf16c8
From this moment on, the "assigner" user can create new assets for bob and alice; we just need to find their certificates.
Let's run a query for an arbitrary asset name under the "bob" security context:
curl -XPOST -d '{"jsonrpc": "2.0", "method": "query", "params": {"type": 1, "chaincodeID": {"name": "'"$HASH"'"}, "ctorMsg": {"args": ["query", "myasset"]}, "secureContext": "bob", "attributes": ["role", "account"]}, "id": 1}' http://localhost:7050/chaincode
(IMPORTANT: without "attributes": ["role", "account"], no attributes will be loaded into the transaction certificate.)
Since the peer is started in debug mode, bob's certificate will be printed in the peer log output. Find the row "[client.bob] Adding new Cert" and copy the certificate value:
30 82 02 90 30 82 02 37 a0 03 02 01 02 02 10 2f 9e 4e da c9 e9 4e 97 b1 58 24 78 4e 15 05 f4 30 0a 06 08 2a 86 48 ce 3d 04 03 03 30 31 31 0b 30 09 06 03 55 04 06 13 02 55 53 31 14 30 12 06 03 55 04 0a 13 0b 48 79 70 65 72 6c 65 64 67 65 72 31 0c 30 0a 06 03 55 04 03 13 03 74 63 61 30 1e 17 0d 31 36 30 39 31 39 32 31 32 34 31 39 5a 17 0d 31 36 31 32 31 38 32 31 32 34 31 39 5a 30 45 31 0b 30 09 06 03 55 04 06 13 02 55 53 31 14 30 12 06 03 55 04 0a 13 0b 48 79 70 65 72 6c 65 64 67 65 72 31 20 30 1e 06 03 55 04 03 13 17 54 72 61 6e 73 61 63 74 69 6f 6e 20 43 65 72 74 69 66 69 63 61 74 65 30 59 30 13 06 07 2a 86 48 ce 3d 02 01 06 08 2a 86 48 ce 3d 03 01 07 03 42 00 04 78 8f f2 11 55 a3 5a 8d f1 b5 4f 38 e4 94 e4 67 b0 47 7f e0 07 04 b8 fb 12 ee 86 17 8a 05 55 e3 98 f6 c1 af 59 ee 2d 54 a9 c5 36 22 cd fa a8 1b ce ba e0 26 fd 73 40 af 20 5d 15 65 89 9c 62 64 a3 82 01 1b 30 82 01 17 30 0e 06 03 55 1d 0f 01 01 ff 04 04 03 02 07 80 30 0c 06 03 55 1d 13 01 01 ff 04 02 30 00 30 0d 06 03 55 1d 0e 04 06 04 04 01 02 03 04 30 0f 06 03 55 1d 23 04 08 30 06 80 04 01 02 03 04 30 10 06 06 2a 03 04 05 06 0a 04 06 63 6c 69 65 6e 74 30 15 06 06 2a 03 04 05 06 0b 04 0b 32 33 34 35 36 2d 36 37 38 39 30 30 4d 06 06 2a 03 04 05 06 07 01 01 ff 04 40 fc c2 07 dd ee ac 8c 76 84 12 07 d2 e0 a6 da b3 06 c9 5b 5b 41 57 a3 f3 a2 f7 59 e2 ed 02 02 7e 56 46 f5 bc 24 00 0a 2e 18 b4 a6 b7 a6 c3 8d ca 15 13 a7 98 42 98 8f 9b 85 a2 d1 6a 77 0d da e8 30 3a 06 06 2a 03 04 05 06 08 04 30 ff d2 ab 7f c8 2d 98 c4 3f c9 f7 05 12 07 01 3a 36 69 f8 ee d1 c4 27 16 48 3e ee ed db b9 b6 3c d6 e5 1a 3e 0b 7d f0 19 1c 81 03 12 f6 7b d5 3e 30 23 06 06 2a 03 04 05 06 09 04 19 30 30 48 45 41 44 72 6f 6c 65 2d 3e 31 23 61 63 63 6f 75 6e 74 2d 3e 32 23 30 0a 06 08 2a 86 48 ce 3d 04 03 03 03 47 00 30 44 02 20 49 52 26 bd b8 f4 a0 98 c6 ff fc 56 3e b5 b0 12 ee ec b7 46 90 55 b1 17 99 29 fe df 80 2e 95 b9 02 20 3b 7f dd 32 88 56 ae a1 14 60 54 60 95 61 fb d1 bc 0c f7 e0 61 f2 e9 0b 46 35 6a 36 61 c9 b8 f0
The certificate must be base64 encoded. One option is to use http://tomeko.net/online_tools/hex_to_base64.php?lang=en
Paste the certificate into the "Hex string" field, click "Convert", and the result will appear in "Output (base64)":
MIICkjCCAjigAwIBAgIRAO9nis6q+khvv6TMvhKbmacwCgYIKoZIzj0EAwMwMTELMAkGA1UEBhMCVVMxFDASBgNVBAoTC0h5cGVybGVkZ2VyMQwwCgYDVQQDEwN0Y2EwHhcNMTYwOTE5MjAyMDE5WhcNMTYxMjE4MjAyMDE5WjBFMQswCQYDVQQGEwJVUzEUMBIGA1UEChMLSHlwZXJsZWRnZXIxIDAeBgNVBAMTF1RyYW5zYWN0aW9uIENlcnRpZmljYXRlMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEqop3N0IpJaLVaRuYioSuHPvyWX3OY9vo4I1YYw1YophcFGFt3fN0X6bDlufUZ5/u81JMmZHozduREnNzM1n+gaOCARswggEXMA4GA1UdDwEB/wQEAwIHgDAMBgNVHRMBAf8EAjAAMA0GA1UdDgQGBAQBAgMEMA8GA1UdIwQIMAaABAECAwQwEAYGKgMEBQYKBAZjbGllbnQwFQYGKgMEBQYLBAsyMzQ1Ni02Nzg5MDBNBgYqAwQFBgcBAf8EQNbPDmdWcOogMkZrlxbRJw/06jg4Ai88KW2+BsuxUnIH5FSa3OY7ZsXJLpceIN4SeEWKDKDsIPCo2wm6cUMYApIwOgYGKgMEBQYIBDDikSBKFYtTmYZRhtVDPhnIoSvefWHQ5Vx5oahIRbG8d/w4J1YTrtVoEwa2jikAqJowIwYGKgMEBQYJBBkwMEhFQURyb2xlLT4xI2FjY291bnQtPjIjMAoGCCqGSM49BAMDA0gAMEUCIQCrUQw2moOA5RFEx/780so4uEOV5esX3fy/It0t2la7gQIgGGVoDoM2kSxWH7TtV4T8W4pY6tN/LXu8XpKWb8+eF0k=
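If you would rather not paste the certificate into a web page, the same hex-to-base64 conversion can be done locally, for example with a small Python sketch (the script and file names below are made up):
# hex2b64.py -- convert the space-separated hex bytes from the peer log to base64
import base64
import sys

hex_string = sys.stdin.read()
cert_bytes = bytes.fromhex("".join(hex_string.split()))
print(base64.b64encode(cert_bytes).decode("ascii"))
Usage would be something like: python hex2b64.py < cert_hex.txt, where cert_hex.txt holds the copied hex bytes.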
"assign" method expects 2 parameters the Name for asset and owner certificate. New asset can be created using:
curl -XPOST -d '{"jsonrpc": "2.0", "method": "invoke", "params": {"type": 1, "chaincodeID": {"name": "'"$HASH"'"}, "ctorMsg": {"args": ["assign", "myasset", "MIICkjCCAjigAwIBAgIRAO9nis6q+khvv6TMvhKbmacwCgYIKoZIzj0EAwMwMTELMAkGA1UEBhMCVVMxFDASBgNVBAoTC0h5cGVybGVkZ2VyMQwwCgYDVQQDEwN0Y2EwHhcNMTYwOTE5MjAyMDE5WhcNMTYxMjE4MjAyMDE5WjBFMQswCQYDVQQGEwJVUzEUMBIGA1UEChMLSHlwZXJsZWRnZXIxIDAeBgNVBAMTF1RyYW5zYWN0aW9uIENlcnRpZmljYXRlMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEqop3N0IpJaLVaRuYioSuHPvyWX3OY9vo4I1YYw1YophcFGFt3fN0X6bDlufUZ5/u81JMmZHozduREnNzM1n+gaOCARswggEXMA4GA1UdDwEB/wQEAwIHgDAMBgNVHRMBAf8EAjAAMA0GA1UdDgQGBAQBAgMEMA8GA1UdIwQIMAaABAECAwQwEAYGKgMEBQYKBAZjbGllbnQwFQYGKgMEBQYLBAsyMzQ1Ni02Nzg5MDBNBgYqAwQFBgcBAf8EQNbPDmdWcOogMkZrlxbRJw/06jg4Ai88KW2+BsuxUnIH5FSa3OY7ZsXJLpceIN4SeEWKDKDsIPCo2wm6cUMYApIwOgYGKgMEBQYIBDDikSBKFYtTmYZRhtVDPhnIoSvefWHQ5Vx5oahIRbG8d/w4J1YTrtVoEwa2jikAqJowIwYGKgMEBQYJBBkwMEhFQURyb2xlLT4xI2FjY291bnQtPjIjMAoGCCqGSM49BAMDA0gAMEUCIQCrUQw2moOA5RFEx/780so4uEOV5esX3fy/It0t2la7gQIgGGVoDoM2kSxWH7TtV4T8W4pY6tN/LXu8XpKWb8+eF0k="]}, "metadata":[97, 115, 115, 105, 103, 110, 101, 114], "secureContext": "assigner", "attributes": ["role", "account"]}, "id": 1}' http://localhost:7050/chaincode
Try to run the earlier query for bob again:
curl -XPOST -d '{"jsonrpc": "2.0", "method": "query", "params": {"type": 1, "chaincodeID": {"name": "'"$HASH"'"}, "ctorMsg": {"args": ["query", "myasset"]}, "secureContext": "bob", "attributes": ["role", "account"]}, "id": 1}' http://localhost:7050/chaincode
and you will see that "myasset" has been created and belongs to account "23456-67890".
Using the same approach, we can find alice's certificate and change the owner of "myasset".
I would like to know how I can configure Spoon to import my data correctly, given that the data is delimited by spaces.
Also, could the import be affected by the fact that the penultimate field, "SP_SEC", is not always present and may be blank?
Here is the data as I have it:
SP_NLE SP_LIB SP_DEP SP_PRV SP_DST SP_APP SP_APM SP_NOM SP_NAC SP_SEX SP_GRI SP_SEC SP_DOC
00000001 000090 70 03 04 BARDALES AHUANARI RENE 19111116 2 10 8
00000003 000001 25 01 01 MEZA DE RUIZ CARLOTA 19400119 2 20 1 1
00000004 000001 25 01 01 BARDALES TORRES JOYCE 19580122 2 20 9 1
00000005 244246 25 01 02 RAMIREZ RUIZ FRANCISCO 19600309 1 20 7 1
00000006 000001 25 01 01 SILVA RIVERA DE RIOS ALICIA 19570310 2 20 5 1
00000008 000001 25 01 01 PACAYA MANIHUARI MANUEL 19401215 1 10 1 1
00000009 233405 25 01 02 TORRES MUÑOZ GLADYS 19650902 2 20 0 1
00000010 000508 25 01 01 OLIVOS RODAS BRITALDO 19510924 1 20 3 1
00000011 000001 25 01 01 ESCUDERO HERNANDEZ JULIA ISABEL 19351118 2 30 1
00000012 000001 25 01 01 YAICATE TARICUARIMA RICARDO 19560118 1 20 0 1
00000013 000001 25 01 01 ESPINOZA DE PINEDO ALEGRIA 19371108 2 10 1
00000014 000001 25 01 01 GARCIA PINCHI RICARDO 19650315 1 30 6 1
00000015 236352 09 01 01 LAO ESPINOZA ALINA 19601217 2 30 4 1
00000017 219532 25 01 01 YAICATE YAHUARCANI OLGA 19530706 2 10 1 1
Please help: what should be placed in the "Regular expression" field, what should be placed in the "Separator" field of the Content tab, and does any other value need to go in any other section?
Any suggestions?
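Not a Spoon recipe, but a quick Python check of why a plain space separator is problematic here: splitting on whitespace gives a different number of tokens per row, because the name fields can themselves contain spaces and SP_SEC may be blank.
# Two rows copied from the sample above; whitespace splitting yields 12 and 14
# tokens for a 13-column layout, so the columns cannot be recovered this way.
rows = [
    "00000001 000090 70 03 04 BARDALES AHUANARI RENE 19111116 2 10 8",
    "00000003 000001 25 01 01 MEZA DE RUIZ CARLOTA 19400119 2 20 1 1",
]
for row in rows:
    print(len(row.split()), row.split())
This usually points to reading the file as fixed width (by column positions) rather than with a separator, but the exact positions would need to be confirmed against the original file.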
I need to format some hexdump like this:
00010: 02 03 04 05
00020: 02 03 04 08
00030: 02 03 04 08
00010: 02 03 04 05
00020: 02 03 04 05
02 03 04 05
02 03 04 08
to
02 03 04 05
02 03 04 08
02 03 04
02 03 04 05
02 03 04 05
02 03 04 05
02 03 04
remove the address fields, if present
remove any 08 at the end of a paragraph (followed by an empty line)
remove any empty lines
How can this be done using lex? Thanks!
It cannot be done directly using lex. Lex is a tokenizer, not a parser.
In all honesty, it can be done with a few regular expressions and doesn't need the complexity of a scanner generator plus parser generator.
If you slurp in the whole file as one string, I think these substitutions will do what you want (written for Perl, but not tested). Note that the address field in your sample is five hex digits, so the first pattern matches any run of hex digits before the colon:
s/^[0-9A-Fa-f]+: //mg
s/ 08\n\s*\n/\n/g
s/^\s*\n//mg
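If it is easier to experiment with in Python, here is a rough equivalent of the same three substitutions (the input file name is a placeholder):
import re

# Read the whole dump as one string ("dump.txt" is a placeholder name)
with open("dump.txt") as fh:
    text = fh.read()

text = re.sub(r"^[0-9A-Fa-f]+:\s*", "", text, flags=re.M)  # strip address fields
text = re.sub(r" 08\n\s*\n", "\n", text)                   # drop a trailing 08 before a blank line
text = re.sub(r"^\s*\n", "", text, flags=re.M)             # drop remaining empty lines

print(text, end="")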