Neovim dap C++ clangd

I'm having trouble configuring dap in NeoVim. After executing :lua require'dap'.continue() I get this error:
Path to executable: /home/user/Projects/C++/app/E5108: Error executing lua ...l/share/nvim/site/pack/packer/start/nvim-dap/lua/dap.lua:385: ...nvim/site/pack/packer/start/nvim-dap/lua/dap/session.lua:1295: Error running /home/user/Devtools/vscode-lldb: EACCES: permission denied
stack traceback:
[C]: in function 'trigger_run'
...l/share/nvim/site/pack/packer/start/nvim-dap/lua/dap.lua:385: in function 'run'
...l/share/nvim/site/pack/packer/start/nvim-dap/lua/dap.lua:319: in function 'cb'
...hare/nvim/site/pack/packer/start/nvim-dap/lua/dap/ui.lua:34: in function 'pick_if_many'
...l/share/nvim/site/pack/packer/start/nvim-dap/lua/dap.lua:313: in function 'select_config_and_run'
...l/share/nvim/site/pack/packer/start/nvim-dap/lua/dap.lua:688: in function 'continue'
[string ":lua"]:1: in main chunk
The user has full access to the vscode-lldb directory:
drwxrwxr-x 17 user user 4,0K dec 30 07:12pm vscode-lldb
debugging.lua
require("dap").adapters.lldb = {
  type = "executable",
  command = "/home/user/Devtools/vscode-lldb",
  name = "lldb",
}
local lldb = {
  name = "Launch lldb",
  type = "lldb", -- matches the adapter
  request = "launch", -- could also attach to a currently running process
  program = function()
    return vim.fn.input(
      "Path to executable: ",
      vim.fn.getcwd() .. "/",
      "file"
    )
  end,
  cwd = "${workspaceFolder}",
  stopOnEntry = false,
  args = {},
  runInTerminal = false,
}
local dap = require('dap')
dap.configurations.cpp = {
  {
    name = 'Launch',
    type = 'lldb',
    request = 'launch',
    program = function()
      return vim.fn.input('Path to executable: ', vim.fn.getcwd() .. '/', 'file')
    end,
    cwd = '${workspaceFolder}',
    stopOnEntry = false,
    args = {},
  },
}
dap.configurations.c = dap.configurations.cpp
I can't find a similar problem, so I'm doing something wrong, but what? Does anyone have any suggestions?
Things I've tried: changing the folder access permissions for the application and for vscode-lldb, and updating the plugins.
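One thing worth checking: the EACCES is raised while running /home/user/Devtools/vscode-lldb, and the listing above shows that path is a directory; a directory can't be executed regardless of its permission bits. Below is a minimal sketch of the adapter block, assuming the standard vscode-lldb extension layout (the adapter/codelldb path is an assumption; point it at wherever the adapter binary actually lives on your machine):

require("dap").adapters.lldb = {
  type = "executable",
  -- hypothetical path: the codelldb binary inside the extension,
  -- not the extension's root directory
  command = "/home/user/Devtools/vscode-lldb/adapter/codelldb",
  name = "lldb",
}

If the binary exists but still fails with EACCES, it may be missing the executable bit (chmod +x on that file).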

Related

Variable passing throwing error in BigQueryInsertJobOperator in Airflow

I have written a BigQueryInsertJobOperator in Airflow to select and insert data into a BigQuery table. But I am facing an issue with variable passing. I get the below error while executing the Airflow DAG.
File "/home/airflow/.local/lib/python3.7/site-packages/google/cloud/bigquery/job/query.py", line 911, in to_api_repr
configuration = self._configuration.to_api_repr()
File "/home/airflow/.local/lib/python3.7/site-packages/google/cloud/bigquery/job/query.py", line 683, in to_api_repr
query_parameters = resource["query"].get("queryParameters")
AttributeError: 'str' object has no attribute 'get'
Here is my Operator code:
dag = DAG(
    'bq_to_sql_operator',
    default_args=default_args,
    schedule_interval="@daily",
    template_searchpath="/opt/airflow/dags/scripts",
    user_defined_macros={"BQ_PROJECT": BQ_PROJECT, "BQ_EDW_DATASET": BQ_EDW_DATASET, "BQ_STAGING_DATASET": BQ_STAGING_DATASET},
    catchup=False
)
t1 = BigQueryInsertJobOperator(
    task_id='bq_write_to_umc_cg_service_agg_stg',
    configuration={
        "query": "{% include 'umc_cg_service_agg_stg.sql' %}",
        "useLegacySql": False,
        "allow_large_results": True,
        "writeDisposition": "WRITE_TRUNCATE",
        "destinationTable": {
            'projectId': BQ_PROJECT,
            'datasetId': BQ_STAGING_DATASET,
            'tableId': UMC_CG_SERVICE_AGG_STG_TABLE_NAME
        }
    },
    params={'BQ_PROJECT': BQ_PROJECT, 'BQ_EDW_DATASET': BQ_EDW_DATASET, 'BQ_STAGING_DATASET': BQ_STAGING_DATASET},
    gcp_conn_id=BQ_CONN_ID,
    location=BQ_LOCATION,
    dag=dag
)
My SQL file looks as below:
select
faccs2.employer_key employer_key,
faccs2.service_name service_name,
gender,
approximate_age_band,
state,
relationship_map_name,
account_attribute1_name,
account_attribute1_value,
account_attribute2_name,
account_attribute2_value,
account_attribute3_name,
account_attribute3_value,
account_attribute4_name,
account_attribute4_value,
account_attribute5_name,
account_attribute5_value,
count(distinct faccs2.sf_service_id) total_service_count
from `{{params.BQ_PROJECT}}.{{params.BQ_EDW_DATASET}}.fact_account_cg_case_survey` faccs
inner join `{{params.BQ_PROJECT}}.{{params.BQ_EDW_DATASET}}.fact_account_cg_case_service` faccs2 on faccs.sf_case_id = faccs2.sf_case_id
inner join `{{params.BQ_PROJECT}}.{{params.BQ_EDW_DATASET}}.dim_account` da on faccs2.account_key = da.account_key
left join `{{params.BQ_PROJECT}}.{{params.BQ_STAGING_DATASET}}.stg_account_selected_attr_tmp2` attr on faccs.account_key = attr.account_key
where not da.is_test_account_flag
and attr.gender is not null
and coalesce(faccs.case_status,'abc') <> 'Closed as Duplicate'
group by 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16;
Can someone please help me fix this issue?
I think that the query configuration should be in a nested document called query:
t1 = BigQueryInsertJobOperator(
    task_id='bq_write_to_umc_cg_service_agg_stg',
    configuration={
        "query": {
            "query": "{% include 'umc_cg_service_agg_stg.sql' %}",
            "useLegacySql": False,
            "allow_large_results": True,
            "writeDisposition": "WRITE_TRUNCATE",
            "destinationTable": {
                'projectId': BQ_PROJECT,
                'datasetId': BQ_STAGING_DATASET,
                'tableId': UMC_CG_SERVICE_AGG_STG_TABLE_NAME
            }
        }
    },
    params={'BQ_PROJECT': BQ_PROJECT, 'BQ_EDW_DATASET': BQ_EDW_DATASET, 'BQ_STAGING_DATASET': BQ_STAGING_DATASET},
    gcp_conn_id=BQ_CONN_ID,
    location=BQ_LOCATION,
    dag=dag
)
With your provided configuration dict, an internal method tries to access queryParameters, which should live in the dict configuration["query"], but it finds a str instead of a dict.
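In other words, a minimal sketch of the two shapes (SELECT 1 is just a placeholder query):

# Wrong: resource["query"] is a str, so resource["query"].get("queryParameters") raises AttributeError
configuration = {"query": "SELECT 1", "useLegacySql": False}

# Right: resource["query"] is a dict (a JobConfigurationQuery object), so .get() works
configuration = {"query": {"query": "SELECT 1", "useLegacySql": False}}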
Consider the below script, which I've used at work.
target_date = '{{ ds_nodash }}'
...
# DAG task
t1 = bq.BigQueryInsertJobOperator(
    task_id='sample_task',
    configuration={
        "query": {
            "query": f"{{% include 'your_query_file.sql' %}}",
            "useLegacySql": False,
            "queryParameters": [
                {
                    "name": "target_date",
                    "parameterType": {"type": "STRING"},
                    "parameterValue": {"value": f"{target_date}"}
                }
            ],
            "parameterMode": "NAMED"
        },
    },
    location='asia-northeast3',
)
-- in your_query_file.sql, the @target_date value is passed as a named parameter.
DECLARE target_date DATE DEFAULT SAFE.PARSE_DATE('%Y%m%d', @target_date);
SELECT ... FROM ... WHERE partitioned_at = target_date;
You can refer to the configuration JSON field specification at the link below.
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query#queryrequest
parameterMode string
Standard SQL only. Set to POSITIONAL to use positional (?) query parameters or to NAMED to use named (@myparam) query parameters in this query.
queryParameters[] object (QueryParameter)
jobs.query parameters for Standard SQL queries.
queryParameters is an array of QueryParameter which has following JSON format.
{
  "name": string,
  "parameterType": {
    object (QueryParameterType)
  },
  "parameterValue": {
    object (QueryParameterValue)
  }
}
https://cloud.google.com/bigquery/docs/reference/rest/v2/QueryParameter
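For comparison, positional mode drops the name field and binds parameters in the order the ? placeholders appear. A hypothetical sketch (the table and value are placeholders):

configuration = {
    "query": {
        "query": "select * from `my_project.my_dataset.my_table` where event_date = ?",
        "useLegacySql": False,
        "parameterMode": "POSITIONAL",
        "queryParameters": [
            {
                "parameterType": {"type": "DATE"},
                "parameterValue": {"value": "2020-01-01"}
            }
        ]
    }
}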

Terraform variable referencing locals not working

I need to pass the database host name (which is dynamically generated) as an environment variable into my task definition. I thought I could set locals and have the variable map refer to a local, but it seems not to work, as I receive this error: error="failed to check table existence: dial tcp: lookup local.grafana-db-address on 10.0.0.2:53: no such host". I am able to execute terraform plan without issues, and the code works when I hard-code the database host name, but that is not optimal.
My Variables and Locals
//MySql Database Grafana Username (Stored as ENV Var in Terraform Cloud)
variable "username_grafana" {
  description = "The username for the DB grafana user"
  type        = string
  sensitive   = true
}
//MySql Database Grafana Password (Stored as ENV Var in Terraform Cloud)
variable "password_grafana" {
  description = "The password for the DB grafana password"
  type        = string
  sensitive   = true
}
variable "db-port" {
  description = "Port for the sql db"
  type        = string
  default     = "3306"
}
locals {
  gra-db-user = var.username_grafana
}
locals {
  gra-db-password = var.password_grafana
}
locals {
  db-address = aws_db_instance.grafana-db.address
}
locals {
  grafana-db-address = "${local.db-address}.${var.db-port}"
}
variable "app_environments_vars" {
  type        = list(map(string))
  description = "Database environment variables needed by Grafana"
  default = [
    {
      "name"  = "GF_DATABASE_TYPE",
      "value" = "mysql"
    },
    {
      "name"  = "GF_DATABASE_HOST",
      "value" = "local.grafana-db-address"
    },
    {
      "name"  = "GF_DATABASE_USER",
      "value" = "local.gra-db-user"
    },
    {
      "name"  = "GF_DATABASE_PASSWORD",
      "value" = "local.gra-db-password"
    }
  ]
}
Task Definition Variable reference
"environment": ${jsonencode(var.app_environments_vars)},
Thank you to everyone who has helped me with this project. I am new to all of this and could not have done it without help from this community.
You can't use dynamic references in your app_environments_vars variable defaults, so values like "value" = "local.grafana-db-address" will never get resolved by TF. Each will be just the literal string "local.grafana-db-address".
You have to modify your code so that all these dynamic references in app_environments_vars get populated in locals.
UPDATE
Your app_environments_vars should be a local variable for it to be resolved:
locals {
  app_environments_vars = [
    {
      "name"  = "GF_DATABASE_TYPE",
      "value" = "mysql"
    },
    {
      "name"  = "GF_DATABASE_HOST",
      "value" = local.grafana-db-address
    },
    {
      "name"  = "GF_DATABASE_USER",
      "value" = local.gra-db-user
    },
    {
      "name"  = "GF_DATABASE_PASSWORD",
      "value" = local.gra-db-password
    }
  ]
}
Then you pass that local to your template for the task definition, as sketched below.
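A minimal sketch of that hand-off, assuming the container definitions are rendered with templatefile() (the file name and resource names are placeholders):

# task-definition.json.tpl keeps the original reference:
#   "environment": ${jsonencode(app_environments_vars)},
resource "aws_ecs_task_definition" "grafana" {
  family = "grafana"
  container_definitions = templatefile("${path.module}/task-definition.json.tpl", {
    app_environments_vars = local.app_environments_vars
  })
}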

Power BI Iterative API Loop

I am attempting (and can successfully do so) to connect to an API and loop through several iterations of the API call in order to grab the next_page value, put it in a list and then call the list.
Unfortunately, when this is published to the PBI service I am unable to refresh there and indeed 'Data Source Settings' tells me I have a 'hand-authored query'.
I have attempted to follow Chris Webb's blog post around the usage of query parameters and relative paths, but if I use this I just get a constant loop of the first page that's hit.
The Start Epoch Time is a helper to ensure I only grab data less than 3 months old.
let
    iterations = 10000, // Maximum number of iterations
    url = "https://www.zopim.com/api/v2/" & "incremental/" & "chats?fields=chats(*)" & "&start_time=" & Number.ToText( StartEpochTime ),
    FnGetOnePage =
        (url) as record =>
        let
            Source1 = Json.Document(Web.Contents(url, [Headers=[Authorization="Bearer MY AUTHORIZATION KEY"]])),
            data = try Source1[chats] otherwise null, // get the data of the page
            next = try Source1[next_page] otherwise null, // ask if there is another page
            res = [Data=data, Next=next]
        in
            res,
    GeneratedList =
        List.Generate(
            ()=>[i=0, res = FnGetOnePage(url)],
            each [i]<iterations and [res][Data]<>null,
            each [i=[i]+1, res = FnGetOnePage([res][Next])],
            each [res][Data])
in
    GeneratedList
Lookups
If Source1 exists, but [chats] may not, you can simplify
= try Source1[chats] otherwise null
to
= Source1[chats]?
Plus, you don't lose non-lookup errors.
See the M spec on operators: https://learn.microsoft.com/en-us/powerquery-m/m-spec-operators
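A tiny sketch of the difference (the record and the error message are made up):

let
    rec = [a = 1],
    viaOptional = rec[b]?,                         // null: only the missing field is handled
    viaTry = try error "real bug" otherwise null   // also null, but a genuine error was silently swallowed
in
    [viaOptional = viaOptional, viaTry = viaTry]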
The Chris Webb method should be something closer to this:
let
    BaseUrl = "https://www.zopim.com", // very important
    Options = [
        RelativePath = "api/v2/incremental/chats",
        Headers = [
            Accept = "application/json"
        ],
        Query = [
            fields = "chats(*)",
            start_time = Number.ToText( StartEpochTime )
        ]
    ],
    Response = Web.Contents(BaseUrl, Options),
    Result = Json.Document(Response) // skip if it's not JSON
in
    Result
Here's an example of a reusable Web.Contents helper function:
let
    /*
    from: <https://github.com/ninmonkey/Ninmonkey.PowerQueryLib/blob/master/source/WebRequest_Simple.pq>
    Wrapper for Web.Contents that returns response metadata
    for options, see: <https://learn.microsoft.com/en-us/powerquery-m/web-contents#__toc360793395>
    Details on preventing "Refresh Errors", using 'Query' and 'RelativePath':
    - Not using Query and RelativePath causes refresh errors:
      <https://blog.crossjoin.co.uk/2016/08/23/web-contents-m-functions-and-dataset-refresh-errors-in-power-bi/>
    - You can opt-in to Skip-Test:
      <https://blog.crossjoin.co.uk/2019/04/25/skip-test-connection-power-bi-refresh-failures/>
    - Debugging and tracing the HTTP requests:
      <https://blog.crossjoin.co.uk/2019/11/17/troubleshooting-web-service-refresh-problems-in-power-bi-with-the-power-query-diagnostics-feature/>
    update:
    - MaybeErrResponse: Quick example of parsing an error result.
    - Raw text is returned, which is useful when there's an error
    - now response[json] does not throw when the data isn't JSON to begin with (false errors)
    */
    WebRequest_Simple =
        (
            base_url as text,
            optional relative_path as nullable text,
            optional options as nullable record
        )
        as record =>
        let
            headers = options[Headers]?, // or: ?? [ Accept = "application/json" ]
            merged_options = [
                Query = options[Query]?,
                RelativePath = relative_path,
                ManualStatusHandling = options[ManualStatusHandling]? ?? { 400, 404, 406 },
                Headers = headers
            ],
            bytes = Web.Contents(base_url, merged_options),
            response = Binary.Buffer(bytes),
            response_metadata = Value.Metadata( bytes ),
            status_code = response_metadata[Response.Status]?,
            response_text = Text.Combine( Lines.FromBinary(response, null, null, TextEncoding.Utf8), "" ),
            json = Json.Document(response),
            IsJsonX = not (try json)[HasError],
            Final = [
                request_url = response_metadata[Content.Uri](),
                response_text = response_text,
                status_code = status_code,
                metadata = response_metadata,
                IsJson = IsJsonX,
                response = response,
                json = if IsJsonX then json else null
            ]
        in
            Final,
    tests = {
        WebRequest_Simple("https://httpbin.org", "json"), // expect: json
        WebRequest_Simple("https://www.google.com"), // expect: html
        WebRequest_Simple("https://httpbin.org", "/headers"),
        WebRequest_Simple("https://httpbin.org", "/status/codes/406"), // expect 404 (bad path)
        WebRequest_Simple("https://httpbin.org", "/status/406"), // expect 406
        WebRequest_Simple("https://httpbin.org", "/get", [ Text = "Hello World" ])
    },
    FinalResults = Table.FromRecords(tests,
        type table[
            status_code = Int64.Type, request_url = text,
            metadata = record,
            response_text = text,
            IsJson = logical, json = any,
            response = binary
        ],
        MissingField.Error
    )
in
    FinalResults

Pyvmomi configure ESXi 'NTP Client Enabled' check box

Using the code below, it's possible to update the startup policy of the ntpd service on an ESXi server:
from pyVim import connect
from pyVmomi import vim

con = connect.SmartConnect(host=host, user=user, pwd=pwd)
content = con.RetrieveContent()
cv = content.viewManager.CreateContainerView(
    container=content.rootFolder, type=[vim.HostSystem], recursive=True)
for child in cv.view:
    child.configManager.serviceSystem.UpdatePolicy(id='ntpd', policy='on')
But there is no clue in the service object:
(vim.host.Service) {
   dynamicType = <unset>,
   dynamicProperty = (vmodl.DynamicProperty) [],
   key = 'ntpd',
   label = 'NTP Daemon',
   required = false,
   uninstallable = false,
   running = false,
   ruleset = (str) [
      'ntpClient'
   ],
   policy = 'off',
   sourcePackage = (vim.host.Service.SourcePackage) {
      dynamicType = <unset>,
      dynamicProperty = (vmodl.DynamicProperty) [],
      sourcePackageName = 'esx-base',
      description = 'This VIB contains all of the base functionality of vSphere ESXi.'
   }
}
But how do you mark the 'NTP Client Enabled' check box for an ESXi host using pyVmomi?
VMware version - 6.0.0
host.configManager.firewallSystem.EnableRuleset(id='ntpClient')
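The check box corresponds to the ntpClient firewall ruleset, so enabling that ruleset is the missing piece. A minimal sketch that folds it into the container-view loop from the question (host, user, and pwd are placeholders):

from pyVim import connect
from pyVmomi import vim

con = connect.SmartConnect(host=host, user=user, pwd=pwd)
content = con.RetrieveContent()
cv = content.viewManager.CreateContainerView(
    container=content.rootFolder, type=[vim.HostSystem], recursive=True)
for host_system in cv.view:
    # the 'NTP Client Enabled' check box maps to the ntpClient firewall ruleset
    host_system.configManager.firewallSystem.EnableRuleset(id='ntpClient')
    # keep ntpd starting with the host, as in the question
    host_system.configManager.serviceSystem.UpdatePolicy(id='ntpd', policy='on')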

How can I update variable values in json body of a request (incrementing of time)

In Postman, how can I update variable values in the JSON body of a request with increasing time? I need to call the endpoint 2048 times, and each call should have an end_time with a 5-minute difference. I'm unable to convert the value to a normal time format.
I wrote this:
var moment = require("moment");
var t = pm.variables.get("t");
pm.environment.set('t', moment().add(1000, 'seconds').valueOf(t));
console.log("t", t);
I see an error:
{
  "ErrorCode": "1100",
  "Message": "request.end_time: Error converting value \"1581351445025\" to type 'System.TimeSpan'. Path 'end_time', line 10, position 29."
}
Sample request (in Body):
{
  "monday": true,
  "tuesday": true,
  "wednesday": true,
  "thursday": true,
  "friday": true,
  "saturday": false,
  "sunday": false,
  "start_time": "7:30:00",
  "end_time": "{{t}}",
  "start_date": "2020-01-23",
  "end_date": "2020-05-23"
}
In the Pre-request Script view:
var moment = require("moment");
var t = pm.variables.get("t");
console.log("t: " + t);
var newT = moment().add(1000, 'seconds').valueOf(); // valueOf() takes no argument; it returns epoch milliseconds
console.log("newT: " + newT);
postman.setEnvironmentVariable("newT", newT);
Then your request body should just change to use the new variable {{newT}}.
Not sure why, but using pm.environment.set wasn't setting the environment variable at all, while postman.setEnvironmentVariable seems to work.
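Note that the 1100 error above complains about converting "1581351445025" to System.TimeSpan, i.e. the API appears to expect a clock time like "7:30:00", not epoch milliseconds. A sketch of a pre-request script that emits that shape instead (the 5-minute step matches the stated requirement; the "7:30:00" starting value is an assumption):

var moment = require("moment");

// read the previous value, defaulting to the start time on the first call
var prev = pm.variables.get("newT") || "7:30:00";

// add 5 minutes and keep the H:mm:ss shape the API expects
var next = moment(prev, "H:mm:ss").add(5, "minutes").format("H:mm:ss");

postman.setEnvironmentVariable("newT", next);
console.log("newT: " + next);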