I have a log file, and I want to extract content from it using a pattern.
The log file looks like this:
2019-05-15 16:40 +07:00: data { data:
[ { audio_incremental_num: 1,
session_id: 'openrJEe7A_1557912549',
stream_time: 88,
duration: 291,
audio_id: '749f7c75-9fe1-4dbc-b5d8-770aadfe94bc'
version: '1.2' },
{ audio_incremental_num: 1,
session_id: 'openrJEe7A_1557912549',
stream_time: 88,
duration: 291,
audio_id: '749f7c75-9fe1-4dbc-b5d8-770aadfe94bc'
version: '1.2' }] }
2019-05-15 16:50 +07:00: data { data:
[ { audio_incremental_num: 1,
session_id: 'openrJEe7A_1557912549',
stream_time: 88,
duration: 291,
audio_id: '749f7c75-9fe1-4dbc-b5d8-770aadfe94bc'
version: '1.2' },
{ audio_incremental_num: 1,
session_id: 'openrJEe7A_1557912549',
stream_time: 88,
duration: 291,
audio_id: '749f7c75-9fe1-4dbc-b5d8-770aadfe94bc'
version: '1.2' }] }
I have tried these, but with no luck:
grep -zo '2019-05-[0-9][1-9] [0-9][0-9]:[0-9][0-9] +07:00: data { data:[[:space:]]'
grep -P '2019-05-[0-9]{2} [0-9]{2}:[0-9]{2} \+07:00: data { data:(\s.*)*.*'
Note: my log file is actually mixed with other log content, so it is not 100% JSON.
Your log file looks like JSON, so you can use jq in bash to parse it; it is very useful. See this link: Working with JSON in bash using jq.
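That said, the entries use unquoted keys and single quotes, so they are not strict JSON and jq will not parse them as-is. A multiline grep can still pull out each timestamped block. A minimal sketch, assuming GNU grep with PCRE support (`-P`); the sample file below is a trimmed stand-in for the real log:

```shell
# Trimmed stand-in for the real log, including an unrelated line.
cat > sample.log <<'EOF'
2019-05-15 16:40 +07:00: data { data:
 [ { audio_incremental_num: 1,
 session_id: 'openrJEe7A_1557912549',
 version: '1.2' }] }
some other unrelated log line
2019-05-15 16:50 +07:00: data { data:
 [ { audio_incremental_num: 1,
 version: '1.2' }] }
EOF

# -z reads the file as one NUL-delimited record, so the pattern can span
# newlines; -o prints only the matched span, from the timestamp up to the
# first closing "] }" (the lazy (.|\n)*? stops the match there).
grep -Pzo '\d{4}-\d{2}-\d{2} \d{2}:\d{2} \+07:00: data \{ data:(.|\n)*?\]\s*\}' sample.log
```

Each match is printed NUL-terminated; pipe through `tr '\0' '\n'` if you want plain newline-separated output.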
Hi, I am trying to copy only one item from a list into another list.
Example 1:
listFinalDevices = listDevices.map((index) => index).toList();
This works somewhat, but it overwrites listFinalDevices every time. I need to add the selected item from listDevices, as a kind of favorite device list.
Example 2:
listFinalDevices.insertAll(listFinalDevices.length,listDevices.map((index) => index).toList());
This copies the complete list, but I only need the item referenced by the index.
Can someone give me a link to an example, or tell me what keywords I should search for?
UPDATE:
To make it clearer, currently I have the following data in the list named listDevices:
[ScanResult{device: BluetoothDevice{id: 4D:55:F7:CE:03:FA, name: , type: BluetoothDeviceType.unknown, isDiscoveringServices: false, _services: [], advertisementData: AdvertisementData{localName: , txPowerLevel: null, connectable: true, manufacturerData: {}, serviceData: {0000fd6f-0000-1000-8000-00805f9b34fb: [215, 226, 39, 186, 231, 145, 9, 162, 217, 184, 33, 163, 133, 92, 23, 221, 40, 117, 217, 176]}, serviceUuids: [0000fd6f-0000-1000-8000-00805f9b34fb]}, rssi: -65}, ScanResult{device: BluetoothDevice{id: 00:80:E1:21:C4:B5, name: P2PSRV1, type: BluetoothDeviceType.le, isDiscoveringServices: false, _services: [], advertisementData: AdvertisementData{localName: P2PSRV1, txPowerLevel: null, connectable: true, manufacturerData: {33537: [0, 0, 32, 0, 0, 128, 225, 33, 196, 181]}, serviceData: {}, serviceUuids: []}, rssi: -35}, ScanResult{device: BluetoothDevice{id: 60:29:4D:9B:AC:52, name: , type: BluetoothDeviceType.unknown, isDiscoveringServices: false, _services: [], advertisementData: Adver
The target is, that I chose some Devices and put this to a kind of favorite list which should named with listFinalDevices:
[ScanResult{device: BluetoothDevice{id: 00:80:E1:21:C4:B5, name: P2PSRV1, type: BluetoothDeviceType.le, isDiscoveringServices: false, _services: [], advertisementData: AdvertisementData{localName: P2PSRV1, txPowerLevel: null, connectable: true, manufacturerData: {33537: [0, 0, 32, 0, 0, 128, 225, 33, 196, 181]}, serviceData: {}, serviceUuids: []}, rssi: -35}, ScanResult{device: BluetoothDevice{id: C5:9F:97:96:4A:A9, name: MX Anywhere 2S, type: BluetoothDeviceType.le, isDiscoveringServices: false, _services: [], advertisementData: AdvertisementData{localName: MX Anywhere 2S, txPowerLevel: 4, connectable: false, manufacturerData: {}, serviceData: {}, serviceUuids: [00001812-0000-1000-8000-00805f9b34fb]}, rssi: -50}] is not empty
listFinalDevices.add(listDevices[index])
Could you be clearer? You want to add the object at a given index from another list to the final list, right?
or maybe:
listFinalDevices.add(listDevices[listDevices.indexWhere((el) => el == index)])
I am trying to invoke a SageMaker endpoint from AWS Lambda using a Lambda function.
This is a sample API call to the endpoint from SageMaker Studio, working as expected:
Here's my Lambda function (inspired by the documentation):
import os
import io
import boto3
import json

ENDPOINT_NAME = 'iris-autoscale-6'
runtime = boto3.client('runtime.sagemaker')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    payload = json.loads(json.dumps(event))
    print(payload)
    response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME, ContentType='application/json', Body=payload)
    print(response)
    result = json.loads(response['Body'].read().decode())
    print(result)
    return result
My error message:
Test Event Name
ProperTest
Response
{
"errorMessage": "Parameter validation failed:\nInvalid type for parameter Body, value: {'sepal_length': [5.1, 4.9, 4.7, 4.6, 5], 'sepal_width': [3.5, 3, 3.2, 3.1, 3.6], 'petal_length': [1.4, 1.4, 1.3, 1.5, 1.4], 'petal_width': [0.2, 0.2, 0.2, 0.2, 0.2]}, type: <class 'dict'>, valid types: <class 'bytes'>, <class 'bytearray'>, file-like object",
"errorType": "ParamValidationError",
"stackTrace": [
" File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME, ContentType='application/json', Body=payload)\n",
" File \"/var/runtime/botocore/client.py\", line 386, in _api_call\n return self._make_api_call(operation_name, kwargs)\n",
" File \"/var/runtime/botocore/client.py\", line 678, in _make_api_call\n api_params, operation_model, context=request_context)\n",
" File \"/var/runtime/botocore/client.py\", line 726, in _convert_to_request_dict\n api_params, operation_model)\n",
" File \"/var/runtime/botocore/validate.py\", line 319, in serialize_to_request\n raise ParamValidationError(report=report.generate_report())\n"
]
}
Function Logs
START RequestId: 70278b9f-f75e-4ac9-a827-7ad35d162512 Version: $LATEST
{'sepal_length': [5.1, 4.9, 4.7, 4.6, 5], 'sepal_width': [3.5, 3, 3.2, 3.1, 3.6], 'petal_length': [1.4, 1.4, 1.3, 1.5, 1.4], 'petal_width': [0.2, 0.2, 0.2, 0.2, 0.2]}
[ERROR] ParamValidationError: Parameter validation failed:
Invalid type for parameter Body, value: {'sepal_length': [5.1, 4.9, 4.7, 4.6, 5], 'sepal_width': [3.5, 3, 3.2, 3.1, 3.6], 'petal_length': [1.4, 1.4, 1.3, 1.5, 1.4], 'petal_width': [0.2, 0.2, 0.2, 0.2, 0.2]}, type: <class 'dict'>, valid types: <class 'bytes'>, <class 'bytearray'>, file-like object
Traceback (most recent call last):
File "/var/task/lambda_function.py", line 17, in lambda_handler
response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME, ContentType='application/json', Body=payload)
File "/var/runtime/botocore/client.py", line 386, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/var/runtime/botocore/client.py", line 678, in _make_api_call
api_params, operation_model, context=request_context)
File "/var/runtime/botocore/client.py", line 726, in _convert_to_request_dict
api_params, operation_model)
File "/var/runtime/botocore/validate.py", line 319, in serialize_to_request
raise ParamValidationError(report=report.generate_report())
END RequestId: 70278b9f-f75e-4ac9-a827-7ad35d162512
REPORT RequestId: 70278b9f-f75e-4ac9-a827-7ad35d162512 Duration: 26.70 ms Billed Duration: 27 ms Memory Size: 128 MB Max Memory Used: 76 MB Init Duration: 343.10 ms
Here's the policy attached to the lambda function:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "sagemaker:InvokeEndpoint",
"Resource": "arn:aws:sagemaker:ap-south-1:<my-account-id>:endpoint/iris-autoscale-6"
}
]
}
The issue is that your payload has an invalid format. It should be one of:
<class 'bytes'>, <class 'bytearray'>, file-like object
The following should address the error (note: you may have many other issues in your code):
payload = json.dumps(event)
print(payload)
response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME, ContentType='application/json', Body=payload.encode())
I would like to add a few properties to the root of the input data. Assume we have:
{
"f1": 1,
"cx": {
"cxf1": 113,
"cxf2": "f23"
},
"cx2": {
"cxf12": 11,
"cxf22": "f2"
}
}
I would like to create CDK Pass step to add simple value and complex value to the root of input and pass combined data further. I should have output as:
{
"f1": 1,
"cx": {
"cxf1": 113,
"cxf2": "f23"
},
"cx2": {
"cxf12": 11,
"cxf22": "f2"
},
"out1": "simple",
"out2complex": {
"f1A": 111,
"f2A": "f22"
}
}
Whatever combination of inputPath, outputPath, and resultPath I try, it does not work. It only works when resultPath is specified, and then my result goes to that path as a complex element.
I assume this is by design: if I specify only result, it overwrites the input.
Is there a way to add simple property and complex property to the root of input object and pass it further?
We need to pass the output of the Pass step with resultPath.
Let's say the Pass step output is the string "simple"; it will be appended to the existing input JSON under the key out1 via resultPath: "$.out1":
const firstStep = new stepfunctions.Pass(this, "Build Out1", {
  result: stepfunctions.Result.fromString("simple"),
  resultPath: "$.out1",
});

const secondStep = new stepfunctions.Pass(this, "Build out2complex", {
  result: stepfunctions.Result.fromObject({
    f1A: 111,
    f2A: "f22",
  }),
  resultPath: "$.out2complex",
});

const def = firstStep.next(secondStep);

new stepfunctions.StateMachine(this, "StateMachine", {
  definition: def,
});
Input:
{
"f1": 1,
"cx": {
"cxf1": 113,
"cxf2": "f23"
},
"cx2": {
"cxf12": 11,
"cxf22": "f2"
}
}
Output:
{
"f1": 1,
"cx": {
"cxf1": 113,
"cxf2": "f23"
},
"cx2": {
"cxf12": 11,
"cxf22": "f2"
},
"out1": "simple",
"out2complex": {
"f1A": 111,
"f2A": "f22"
}
}
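For reference, the two Pass states above synthesize to roughly the following Amazon States Language (a sketch; the state names come from the CDK construct IDs). Each state's Result is merged into the input at its ResultPath, which is why both keys end up at the root:

```json
{
  "StartAt": "Build Out1",
  "States": {
    "Build Out1": {
      "Type": "Pass",
      "Result": "simple",
      "ResultPath": "$.out1",
      "Next": "Build out2complex"
    },
    "Build out2complex": {
      "Type": "Pass",
      "Result": {
        "f1A": 111,
        "f2A": "f22"
      },
      "ResultPath": "$.out2complex",
      "End": true
    }
  }
}
```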
I'm trying to submit a job with c5d.18xlarge (vcpus: 72, memory: 144000), but it still hasn't come up after 20 minutes, even though the maximum vCPUs of my compute environment is 256.
let containerInfo = {
  jobDefinition: '',
  jobName: '',
  jobQueue: '',
  parameters: {
    companyId: params.companyId
  },
  containerOverrides: {
    vcpus: 72,
    memory: 144000
  }
}
Is it possible to search journalctl metadata with patterns? What I am doing right now is searching like journalctl CONTAINER_NAME=cranky.hello --lines=100 -f. But what I want to achieve is to match everything after that '.', with a search pattern like journalctl CONTAINER_NAME=cranky.* --lines=100 -f, which would also match CONTAINER_NAME metadata like:
cranky.world
cranky.alive
Below are examples of output when journalctl is executed:
journalctl CONTAINER_NAME=cranky.hello --lines=100 -f
Oct 17 14:33:35 lottery-staging docker[55587]: chdir: /usr/src/app
Oct 17 14:33:35 lottery-staging docker[55587]: daemon: False
Oct 17 14:33:35 lottery-staging docker[55587]: raw_env: []
Oct 17 14:33:35 lottery-staging docker[55587]: pidfile: None
Oct 17 14:33:35 lottery-staging docker[55587]: worker_tmp_dir: None
journalctl CONTAINER_NAME=cranky.hello --lines=100 -f -o json
{ "__CURSOR" : "s=d98b3d664a71409d9a4d6145b0f8ad93;i=731e;b=2f9d75ec91044d52b8c5e5091370bcf7;m=285b067a063;t=55bbf0361352a;x=64b377c33c8fba96", "__REALTIME_TIMESTAMP" : "1508250837136682", "__MONOTONIC_TIMESTAMP" : "2773213487203", "_BOOT_ID" : "2f9d75ec91044d52b8c5e5091370bcf7", "CONTAINER_TAG" : "", "_TRANSPORT" : "journal", "_PID" : "55587", "_UID" : "0", "_GID" : "0", "_COMM" : "docker", "_EXE" : "/usr/bin/docker", "_CMDLINE" : "/usr/bin/docker daemon -H unix:///var/run/docker.sock -H tcp://0.0.0.0:2375 --userland-proxy=false --tlscert /etc/dockercloud/agent/cert.pem --tlskey /etc/dockercloud/agent/key.pem --tlscacert /etc/dockercloud/agent/ca.pem --tlsverify --log-driver journald", "_SYSTEMD_CGROUP" : "/", "_SELINUX_CONTEXT" : [ 117, 110, 99, 111, 110, 102, 105, 110, 101, 100, 10 ], "_MACHINE_ID" : "0a80624bd4c45a792b0a857c59a858d6", "_HOSTNAME" : "lottery-staging", "PRIORITY" : "6", "MESSAGE" : "Running migrations:", "CONTAINER_ID_FULL" : "c8f60546e9d50f034f364259c409760b3390d979d57a773eccd8d852e1c3553f", "CONTAINER_NAME" : "ghost-1.lottery-staging-stack.c6118be4", "CONTAINER_ID" : "c8f60546e9d5", "_SOURCE_REALTIME_TIMESTAMP" : "1508250837135650" }
{ "__CURSOR" : "s=d98b3d664a71409d9a4d6145b0f8ad93;i=731f;b=2f9d75ec91044d52b8c5e5091370bcf7;m=285b067a2a2;t=55bbf0361376a;x=6c87fea4ea155d00", "__REALTIME_TIMESTAMP" : "1508250837137258", "__MONOTONIC_TIMESTAMP" : "2773213487778", "_BOOT_ID" : "2f9d75ec91044d52b8c5e5091370bcf7", "CONTAINER_TAG" : "", "_TRANSPORT" : "journal", "_PID" : "55587", "_UID" : "0", "_GID" : "0", "_COMM" : "docker", "_EXE" : "/usr/bin/docker", "_CMDLINE" : "/usr/bin/docker daemon -H unix:///var/run/docker.sock -H tcp://0.0.0.0:2375 --userland-proxy=false --tlscert /etc/dockercloud/agent/cert.pem --tlskey /etc/dockercloud/agent/key.pem --tlscacert /etc/dockercloud/agent/ca.pem --tlsverify --log-driver journald", "_SYSTEMD_CGROUP" : "/", "_SELINUX_CONTEXT" : [ 117, 110, 99, 111, 110, 102, 105, 110, 101, 100, 10 ], "_MACHINE_ID" : "0a80624bd4c45a792b0a857c59a858d6", "_HOSTNAME" : "lottery-staging", "PRIORITY" : "6", "MESSAGE" : " No migrations to apply.", "CONTAINER_ID_FULL" : "c8f60546e9d50f034f364259c409760b3390d979d57a773eccd8d852e1c3553f", "CONTAINER_NAME" : "ghost-1.lottery-staging-stack.c6118be4", "CONTAINER_ID" : "c8f60546e9d5", "_SOURCE_REALTIME_TIMESTAMP" : "1508250837135667" }
journalctl does not accept patterns for anything other than unit names (in the -u argument). Depending on your needs, you could perform some filtering using JSON output and grep, as in:
journalctl -u docker -o json -n1000 | grep 'CONTAINER_NAME.*cranky\.'
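Note that the grep above is a substring match over the whole JSON line, so it could also hit a "cranky." occurring inside MESSAGE. If jq is available, you can filter on the actual field instead. A minimal sketch; the sample lines below are trimmed stand-ins for real `journalctl -o json` entries:

```shell
# Trimmed stand-ins for journalctl -o json output (one JSON object per line).
printf '%s\n' \
  '{"MESSAGE":"hello","CONTAINER_NAME":"cranky.world"}' \
  '{"MESSAGE":"bye","CONTAINER_NAME":"other.app"}' \
  > sample.json

# Keep only entries whose CONTAINER_NAME starts with "cranky.".
# Live equivalent: journalctl -o json -f | jq -c 'select(...)'
jq -c 'select(.CONTAINER_NAME | startswith("cranky."))' sample.json
```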