I connected a Kylin datasource and its data previews fine in SQL Lab, but when I enter SQL in SQL Lab and click the Run Query button, the query status stays pending and the HTTP response looks like this:
{"query": {"changedOn": 1574996199515.888, "changed_on": "2019-11-29T02:56:39.515888", "dbId": 4, "db": "Kylin", "endDttm": null, "errorMessage": null, "executedSql": null, "id": "5H-DGzWuw", "limit": 1000, "progress": 0, "rows": null, "schema": "DEFAULT", "ctas": false, "serverId": 59, "sql": "select a.country,a.name from DEFAULT.KYLIN_COUNTRY ", "sqlEditorId": "k7GQtMhzk", "startDttm": 1574996199504.259033, "state": "pending", "tab": "Untitled Query", "tempTable": "", "userId": 1, "user": "admin user", "resultsKey": null, "trackingUrl": null, "extra": {}}}
How do I resolve it?
Resolved: the selected datasource had the asynchronous query option enabled.
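For context: when "Asynchronous query execution" is enabled on a database, SQL Lab hands queries off to Celery workers, so queries stay pending forever if no worker and results backend are running. Either uncheck that option on the database, or configure Celery along the lines of this sketch (the exact setting names vary by Superset version, so treat it as an assumption to verify):
# superset_config.py -- sketch only; adjust names to your Superset/Celery versions
from cachelib.redis import RedisCache

class CeleryConfig:
    broker_url = "redis://localhost:6379/0"       # assumed local Redis broker
    imports = ("superset.sql_lab",)
    result_backend = "redis://localhost:6379/0"

CELERY_CONFIG = CeleryConfig

# SQL Lab stores asynchronous query results here
RESULTS_BACKEND = RedisCache(host="localhost", port=6379, key_prefix="superset_results")
You also need a Celery worker running against the Superset app; the exact worker command depends on your version.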
I'm currently getting a standard paginated response using Django REST Framework, but I need to add a sequential number (1, 2, 3, 4, 5, etc.) to each object in the order it is retrieved. It must be recalculated dynamically each time new results are added and retrieved by the user. It would go from this:
{
"next": "http://localhost:8000/api/home/?page=2",
"previous": null,
"count": 105,
"results": [
{
"title": "example1",
"description": "foobar1",
},
{
"title": "example2",
"description": "",
},
...
]
}
to look like this:
{
"next": "http://localhost:8000/api/home/?page=2",
"previous": null,
"count": 105,
"results": [
{
"key": 1,
"title": "example1",
"description": "foobar1",
},
{
"key": 2,
"title": "example2",
"description": "",
},
...
]
}
This would also carry over to the next page in the paginated response, e.g. 21, 22, 23, 24. If a user then refreshes the page and new data is found and retrieved, the key numbers would be recalculated and added again. How would one go about this?
Thanks!
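One possible approach (a sketch only, since the response shape above matches DRF's PageNumberPagination): subclass it and inject a running key from the page's start index, so page 2 continues at 21, 22, and so on. The class name and page size below are assumptions:
# pagination.py -- minimal sketch assuming PageNumberPagination is in use
from rest_framework.pagination import PageNumberPagination

class NumberedPagination(PageNumberPagination):
    page_size = 20  # assumed page size; match your own settings

    def get_paginated_response(self, data):
        # self.page is the Django Paginator page set in paginate_queryset();
        # start_index() is the 1-based position of the first item on this page,
        # so numbering continues across pages (page 2 starts at 21, 22, ...).
        start = self.page.start_index()
        numbered = [{"key": start + i, **item} for i, item in enumerate(data)]
        return super().get_paginated_response(numbered)
Hook it up with pagination_class = NumberedPagination on the view (or DEFAULT_PAGINATION_CLASS in the REST_FRAMEWORK settings). Because the key is derived from the current page on every request, it is recomputed whenever new data is fetched.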
I am able to create or update an existing project's image on BIM 360 using Postman, but is there any way to GET the information about it? Using the GET method, there is no information about the image in the response, as shown below.
URL:
https://developer.api.autodesk.com/hq/v1/accounts/{{Account_Id}}/projects/{{Project_Id}}
HTTP method:
GET
Result:
{
"id": "**************************",
"account_id": "**************************",
"name": "My Project(2)",
"start_date": "2018-11-06",
"end_date": "2018-12-06",
"value": null,
"currency": "USD",
"status": "active",
"job_number": null,
"address_line_1": null,
"address_line_2": null,
"city": null,
"state_or_province": null,
"postal_code": null,
"country": "United States",
"business_unit_id": null,
"created_at": "2019-01-14T15:31:25.950Z",
"updated_at": "2019-01-18T09:35:27.071Z",
"project_type": "Demonstration Project",
"timezone": null,
"language": "en",
"construction_type": null,
"contract_type": null,
"last_sign_in": "2019-01-15T14:59:08.000Z",
"service_types": "doc_manager,insight,admin"
}
Also, sending a GET request to the following URL doesn't work; the endpoint doesn't exist.
https://developer.api.autodesk.com/hq/v1/accounts/{{Account_Id}}/projects/{{Project_Id}}/image
Unfortunately, while we provide a PATCH API to upload/update the image of a BIM 360 project, there is no API to fetch the project image. This is already tracked internally, and I will add more comments for this request.
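For completeness, the existing upload/update call looks roughly like the sketch below; the multipart field name ("chunk") and the token scope are assumptions you should verify against the current BIM 360 Account Admin documentation:
# Sketch only; verify the endpoint and the multipart field name before relying on it.
import requests

ACCOUNT_ID = "your-account-id"   # placeholder
PROJECT_ID = "your-project-id"   # placeholder
TOKEN = "your-access-token"      # placeholder 2-legged token

url = (f"https://developer.api.autodesk.com/hq/v1/accounts/"
       f"{ACCOUNT_ID}/projects/{PROJECT_ID}/image")

with open("project_image.png", "rb") as image:
    resp = requests.patch(
        url,
        headers={"Authorization": f"Bearer {TOKEN}"},
        files={"chunk": image},  # "chunk" is an assumed field name
    )
resp.raise_for_status()
print(resp.json())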
I'm using the latest ECS Agent (v1.22.0) on a Windows ECS cluster. I want to try out the new feature https://docs.aws.amazon.com/AmazonECS/latest/developerguide/specifying-sensitive-data.html?shortFooter=true, but task placement fails with the following error message:
service XXX was unable to place a task because no container instance met
all of its requirements. The closest matching container-instance YYY is
missing an attribute required by your task.
The ECS Agent logs don't appear to show any error. My task has the following attributes:
"requiresAttributes": [
{
"targetId": null,
"targetType": null,
"value": null,
"name": "com.amazonaws.ecs.capability.docker-remote-api.1.28"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "ecs.capability.execution-role-ecr-pull"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "com.amazonaws.ecs.capability.ecr-auth"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "com.amazonaws.ecs.capability.task-iam-role"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "ecs.capability.execution-role-awslogs"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "com.amazonaws.ecs.capability.logging-driver.awslogs"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "ecs.capability.secrets.ssm.environment-variables"
},
{
"targetId": null,
"targetType": null,
"value": null,
"name": "com.amazonaws.ecs.capability.docker-remote-api.1.19"
}
]
FWIW, the Windows instructions seem to include a gotcha here:
For Windows tasks that are configured to use the awslogs logging driver, you must also set the ECS_ENABLE_AWSLOGS_EXECUTIONROLE_OVERRIDE environment variable on your container instance. This can be done with User Data using the following syntax:
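That user-data snippet looks roughly like this (the cluster name is a placeholder, and additional Initialize-ECSAgent parameters may apply to your setup; follow the linked instructions for the exact current syntax):
<powershell>
[Environment]::SetEnvironmentVariable("ECS_ENABLE_AWSLOGS_EXECUTIONROLE_OVERRIDE", $TRUE, "Machine")
Initialize-ECSAgent -Cluster 'my-windows-cluster' -EnableTaskIAMRole
</powershell>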
See the instructions here for the full details.
I know that we can deploy our applications through Pivotal Cloud Foundry, and that we can push buildpacks that provide framework and runtime support for our applications. I want to create a Jenkins job that lists all the buildpacks available on my Cloud Foundry. How can this be achieved? Thanks!
You can use the CLI to list the buildpacks (cf buildpacks), or you can query the cloud controller directly (the api.<system-domain> endpoint) by GETing /v2/buildpacks; however, you need to be an authenticated user to make that request.
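For a Jenkins job, either approach can run in a script step. A minimal sketch of the direct cloud-controller query, assuming the cf CLI is already installed and logged in on the Jenkins agent and that api.system.example.com stands in for your API endpoint:
# Sketch: list buildpacks by hitting the cloud controller's /v2/buildpacks endpoint.
import subprocess
import requests

API_URL = "https://api.system.example.com"  # placeholder for your api.<system-domain>

# "cf oauth-token" prints a "bearer ..." token for the currently logged-in user
token = subprocess.check_output(["cf", "oauth-token"], text=True).strip()

resp = requests.get(f"{API_URL}/v2/buildpacks", headers={"Authorization": token})
resp.raise_for_status()

for resource in resp.json()["resources"]:
    entity = resource["entity"]
    print(entity["position"], entity["name"], entity["filename"])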
You can even run the same query directly through the cf CLI:
# cf curl /v2/buildpacks
{
"total_results": 9,
"total_pages": 1,
"prev_url": null,
"next_url": null,
"resources": [
{
"metadata": {
"guid": "b7890a54-f7c5-4973-a3da-e1a48ba6811d",
"url": "/v2/buildpacks/b7890a54-f7c5-4973-a3da-e1a48ba6811d",
"created_at": "2017-05-24T12:53:27Z",
"updated_at": "2017-05-24T12:53:27Z"
},
"entity": {
"name": "binary_buildpack",
"position": 1,
"enabled": true,
"locked": false,
"filename": "binary_buildpack-cached-v1.0.11.zip"
}
},
...
{
"metadata": {
"guid": "95e3f977-09d1-4b96-96bc-e34125e3b3a2",
"url": "/v2/buildpacks/95e3f977-09d1-4b96-96bc-e34125e3b3a2",
"created_at": "2017-05-24T12:54:03Z",
"updated_at": "2017-05-24T12:54:04Z"
},
"entity": {
"name": "staticfile_buildpack",
"position": 8,
"enabled": true,
"locked": false,
"filename": "staticfile_buildpack-cached-v1.4.5.zip"
}
}
]
}
Docs: https://apidocs.cloudfoundry.org/258/
I've been searching the SoftLayer API (SLAPI) for a way to order bare metal servers with a partition template for the OS.
After reading some articles on ordering RAID and configuring the partition template data, I found that the ID or description of the OS is required to get the template data.
So I've tried to get this OS information with SLAPI calls, but I couldn't.
For 'CentOS 7.x (64 bit)', the OS description should be 'linux', but I don't know how to obtain it; the OS item ID is 5920 and the item price ID is 44988 in dal03.
Here is the referenced article: Configuring Softlayer Disk Partitions at Order Time
and this is the response from calling [services/SoftLayer_Hardware_Component_Partition_OperatingSystem]/getAllObjects:
[{
"description": "linux",
"id": 1,
"notes": "All flavors"
}, {
"description": "windows",
"id": 2,
"notes": "All RH-based or closely related"
}, {
"description": "freebsd",
"id": 3,
"notes": "FreeBSD, etc.."
}]
and the response for item price 44988 is:
{
"currentPriceFlag": null,
"hourlyRecurringFee": "0",
"id": 44988,
"itemId": 5920,
"laborFee": "0",
"locationGroupId": null,
"onSaleFlag": null,
"oneTimeFee": "0",
"quantity": null,
"recurringFee": "0",
"setupFee": "0",
"sort": 0,
"item": {
"capacity": "0",
"description": "CentOS 7.x (64 bit)",
"id": 5920,
"itemTaxCategoryId": 166,
"keyName": "OS_CENTOS_7_X_64_BIT",
"softwareDescriptionId": 1400,
"units": "N/A",
"upgradeItemId": null,
"itemCategory": {
"categoryCode": "os",
"id": 12,
"name": "Operating System",
"quantityLimit": 0
},
"softwareDescription": {
"controlPanel": 0,
"id": 1400,
"licenseTermValue": null,
"longDescription": "CentOS / CentOS / 7.0-64",
"manufacturer": "CentOS",
"name": "CentOS",
"operatingSystem": 1,
"referenceCode": "CENTOS_7_64",
"upgradeSoftwareDescriptionId": null,
"upgradeSwDescId": null,
"version": "7.0-64",
"virtualLicense": 0,
"virtualizationPlatform": 0,
"requiredUser": "root"
}
}
}
That information is not in the API; you have to use your own code to pick out the correct template. For that you could use the description of the item, e.g. (see the sketch after this list):
if the item description contains CentOS, Ubuntu, or RedHat, use linux
if the item description contains Windows, use windows
if the item description contains FreeBSD, use freebsd
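A minimal sketch of that mapping (the keyword checks are assumptions; extend them for the operating systems you actually order):
def partition_os_description(item_description):
    """Map an OS item description (e.g. "CentOS 7.x (64 bit)") to a
    SoftLayer_Hardware_Component_Partition_OperatingSystem description."""
    desc = item_description.lower()
    if "windows" in desc:
        return "windows"
    if "freebsd" in desc:
        return "freebsd"
    # CentOS, Ubuntu, RedHat and other Linux flavors fall back to the generic template
    return "linux"

print(partition_os_description("CentOS 7.x (64 bit)"))  # -> linux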
Regards