I have encountered a couple of issues when using supersetEmbeddedSdk (apache-superset==2.0.0):
I cannot hide filters with
filters: {
    expanded: false,
    visible: false,
}
In the network traffic I can see a call to http://localhost:8088/api/v1/dashboard/2/filter_state?tab_id=1 which fails with a 500 error: {msg: "Missing Authorization Header"}
In the Superset logs I can see:
2022-10-04 18:24:22,760:ERROR:root:'GuestUser' object has no attribute 'get_user_id'
Traceback (most recent call last):
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/flask_appbuilder/api/__init__.py", line 86, in wraps
return f(self, *args, **kwargs)
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/utils/log.py", line 245, in wrapper
value = f(*args, **kwargs)
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/dashboards/filter_state/api.py", line 98, in post
return super().post(pk)
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/views/base_api.py", line 83, in wraps
return f(self, *args, **kwargs)
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/temporary_cache/api.py", line 76, in post
key = self.get_create_command()(args).run()
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/temporary_cache/commands/create.py", line 35, in run
return self.create(self._cmd_params)
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/dashboards/filter_state/commands/create.py", line 41, in create
entry: Entry = {"owner": get_owner(actor), "value": value}
File "/Users/sebastiankruk/Development/private/paiku/.envs/lib/python3.9/site-packages/superset/key_value/utils.py", line 70, in get_owner
return user.get_user_id() if not user.is_anonymous else None
AttributeError: 'GuestUser' object has no attribute 'get_user_id'
However, the chart renders correctly, and I was even able to expand the iframe with the following code (my app is written in Flask):
supersetEmbeddedSdk.embedDashboard({
    id: '{{ dashboard_id }}',
    supersetDomain: '{{ superset_public_url }}',
    mountPoint: document.getElementById('{{ chart_wrapper }}'),
    fetchGuestToken: () => '{{ token }}',
    dashboardUiConfig: {
        hideTitle: true,
        hideTab: true,
        hideChartControls: true,
        filters: {
            expanded: false,
            visible: false,
        }
    }
}).then(embedVSC => {
    const size = embedVSC.getScrollSize();
    if (size.width !== 0) {
        const iframe = document.querySelector("#{{ chart_wrapper }} > iframe");
        iframe.setAttribute("width", "100%");
        iframe.setAttribute("height", "450");
    }
})
And last but certainly not least … each time I open a page with the chart, my session is terminated and I have to log in to my app again.
I tried omitting the filters section, but the same call was made with the same result (500).
What am I missing?
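For context, here is roughly how I mint the guest token that gets injected as {{ token }} (a minimal sketch; the /api/v1/security/login and /api/v1/security/guest_token/ endpoints are Superset's security API, while the credentials, dashboard UUID, and empty RLS list are placeholders):

import requests

SUPERSET_URL = "http://localhost:8088"  # same host as the failing filter_state call

def fetch_guest_token(dashboard_uuid):
    # 1. Log in with a service account to obtain an access token.
    login = requests.post(
        f"{SUPERSET_URL}/api/v1/security/login",
        json={
            "username": "embed_user",      # placeholder
            "password": "embed_password",  # placeholder
            "provider": "db",
            "refresh": True,
        },
    )
    access_token = login.json()["access_token"]

    # 2. Exchange it for a guest token scoped to the embedded dashboard.
    guest = requests.post(
        f"{SUPERSET_URL}/api/v1/security/guest_token/",
        headers={"Authorization": f"Bearer {access_token}"},
        json={
            "user": {"username": "guest", "first_name": "Guest", "last_name": "User"},
            "resources": [{"type": "dashboard", "id": dashboard_uuid}],
            "rls": [],
        },
    )
    return guest.json()["token"]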
I am having problems using the new Airflow operator BigQueryCreateExternalTableOperator within Google Composer:
Question 1
After creating an Airflow task, this is happening:
AttributeError: 'BigQueryCreateExternalTableOperator' object has no attribute 'bucket'
However, as I am querying a Google Sheets file, why is it looking for the bucket argument? I am going crazy trying to find out what is happening! According to the docs it is optional!
Sample Code
task1 = BigQueryCreateExternalTableOperator(
    task_id="task1_externaltable",
    table_resource={
        "tableReference": {
            "projectId": projectid,
            "datasetId": datasetid,
            "tableId": tableid,
        },
        "schema": schema_fields,
        "externalDataConfiguration": {
            "sourceFormat": "GOOGLE_SHEETS",
            "autodetect": False,
            "compression": "NONE",
            "googleSheetsOptions": {
                "skipLeadingRows": 1,
                "range": gsheets_tab_name,
            },
            "sourceUris": gsheets_url,
        },
    },
)
Following Elad's suggestion, here is the full error traceback:
AttributeError: 'BigQueryCreateExternalTableOperator' object has no attribute 'bucket'
[2022-03-18, 14:45:38 UTC] {taskinstance.py:1268} INFO - Marking task as UP_FOR_RETRY. dag_id=trm_analytics_attribution_collision_checker_dag, task_id=create_manual_attribution_2_external_table, execution_date=20220318T144520, start_date=20220318T144536, end_date=20220318T144538
[2022-03-18, 14:45:38 UTC] {standard_task_runner.py:89} ERROR - Failed to execute job 444 for task create_manual_attribution_2_external_table
Traceback (most recent call last):
File "/opt/python3.8/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/utils/cli.py", line 94, in wrapper
return f(*args, **kwargs)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 302, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 180, in _run_raw_task
ti._run_raw_task(
File "/opt/python3.8/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1330, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1420, in _execute_task_with_callbacks
self.render_templates(context=context)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1995, in render_templates
self.task.render_template_fields(context)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1061, in render_template_fields
self._do_render_template_fields(self, self.template_fields, context, jinja_env, set())
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1072, in _do_render_template_fields
content = getattr(parent, attr_name)
AttributeError: 'BigQueryCreateExternalTableOperator' object has no attribute 'bucket'
[2022-03-18, 14:45:38 UTC] {local_task_job.py:154} INFO - Task exited with return code 1
[2022-03-18, 14:45:38 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
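From the traceback, the failure seems to happen while Airflow renders template_fields and calls getattr(self, 'bucket') on a legacy field that is never set when only table_resource is passed. A possible workaround I am considering (just a sketch under that assumption; the subclass name is mine and I have not verified it in Composer):

from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCreateExternalTableOperator,
)


class TableResourceExternalTableOperator(BigQueryCreateExternalTableOperator):
    # Limit Jinja templating to the argument actually provided, so the
    # renderer never looks up the unset legacy 'bucket' attribute.
    template_fields = ("table_resource",)

# Use this class in place of BigQueryCreateExternalTableOperator in the
# task definition above, keeping the same table_resource dict.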
In Python I am writing query results (showing how many queries are being run within a project) into a table, and upon insertion the values are sent to Stackdriver as a custom time-series metric. Upon time-series creation it returns an error saying google.api_core.exceptions.InvalidArgument: 400 One or more TimeSeries could not be written: Metrics cannot be written to bigquery_project. See https://cloud.google.com/monitoring/custom-metrics/creating-metrics#which-resource for a list of writable resource types.: timeSeries[0].
The resource type I am using is bigquery_project, but I do not believe that is the cause of my errors. I think there are too many values at one point per request (e.g. the query response returns 20 queries at 12:00), which causes the error; but if that is the cause, how do I space out the time for each query response?
Here is some code:
import datetime
import time

from google.cloud import bigquery, monitoring_v3


def get_time_series(result):
    series = monitoring_v3.types.TimeSeries()
    series.metric.type = 'custom.googleapis.com/Blah/Blah_blah_blah'
    series.resource.type = 'bigquery_project'
    series.resource.labels['project_id'] = "project-name"
    series.resource.labels['region'] = 'us-central1'
    point = series.points.add()
    point.interval.end_time.seconds = int(time.time())
    try:
        point.value.int64_value = result
    except TypeError:
        # result is not an integer; fall back to milliseconds since midnight
        print("Something went wrong ..")
        now = datetime.datetime.utcnow()
        point.value.int64_value = int(
            (now - now.replace(hour=0, minute=0, second=0, microsecond=0)).total_seconds() * 1000)
    return series
def query_stackoverflow():
    client_bq = bigquery.Client(project="project-name")
    client = monitoring_v3.MetricServiceClient()
    query_job = client_bq.query("""
        SELECT
          job_id,
          creation_time,
          query,
          total_bytes_processed
        FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
        WHERE project_id = 'project-name'
          AND creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
          AND CURRENT_TIMESTAMP()
        ORDER BY creation_time DESC
        LIMIT 100""")
    results = query_job.result()  # Waits for job to complete.
    rows = list(results)

    response = []
    for row in rows:
        if row.total_bytes_processed is not None:
            # On-demand pricing: roughly $5 per TiB scanned
            cost_dollars = (row.total_bytes_processed / 1024 ** 4) * 5
        else:
            cost_dollars = 0
        response.append(f"Creation_Time :{row.creation_time} | Estimated_Cost : {cost_dollars}")
    # print(response)

    timeseries = []
    for x in range(len(response)):
        # one TimeSeries per query result
        timeseries.append(get_time_series(response[x]))
    print(timeseries)
    client.create_time_series("projects/project-name", timeseries)
    print("Blah metrics are successfully sent")
    print("FINISHED")
Stack trace:
Traceback (most recent call last):
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\grpc\_channel.py", line 690, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\grpc\_channel.py", line 592, in _end_unary_response_blocking
raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "One or more TimeSeries could not be written: Metrics cannot be written to bigquery_project. See https://cloud.google.com/monitoring/custom-metrics/creating-metrics#which-resource for a list of writable resource types.: timeSeries[0]"
debug_error_string = "{"created":"#1585841627.232000000","description":"Error received from peer ipv4:172.217.10.138:443","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"One or more TimeSeries could not be written: Metrics cannot be written to bigquery_project. See https://cloud.google.com/monitoring/custom-metrics/creating-metrics#which-resource for a list of writable resource types.: timeSeries[0]","grpc_status":3}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:/Users/MalCode/PycharmProjects/Vulnerabilities/test.py", line 83, in <module>
query_stackoverflow()
File "C:/Users/MalCode/PycharmProjects/Vulnerabilities/test.py", line 76, in query_stackoverflow
client.create_time_series("projects/project-name", timeseries)
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\cloud\monitoring_v3\gapic\metric_service_client.py", line 1039, in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\gapic_v1\method.py", line 143, in __call__
return wrapped_func(*args, **kwargs)
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\retry.py", line 286, in retry_wrapped_func
on_error=on_error,
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\retry.py", line 184, in retry_target
return target()
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "C:\Users\MalCode\PycharmProjects\Vulnerabilities\venv\lib\site-packages\google\api_core\grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 One or more TimeSeries could not be written: Metrics cannot be written to bigquery_project. See https://cloud.google.com/monitoring/custom-metrics/creating-metrics#which-resource for a list of writable resource types.: timeSeries[0]
Regarding your error message: the documentation on creating custom metrics includes a list of the monitored resource types that are available, and the resource type 'bigquery_project' is not among them.
Instead, when no more specific resource type fits, you can use 'global', 'generic_node' or 'generic_task'; that documentation describes the differences between them so you can choose the one that best suits your case.
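As an illustration, the change would be limited to the resource block inside get_time_series (a minimal sketch based on the code in the question; the metric type and project name are the question's placeholders):

from google.cloud import monitoring_v3

series = monitoring_v3.types.TimeSeries()
series.metric.type = 'custom.googleapis.com/Blah/Blah_blah_blah'
series.resource.type = 'global'                        # writable, unlike 'bigquery_project'
series.resource.labels['project_id'] = 'project-name'
# 'global' only defines the project_id label, so the 'region' label is dropped.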
I have manually installed the full OpenEdx stack, but when trying to edit an existing unit (creating one works), I get the following error:
Here is the exception stack from the logs:
Oct 3 16:47:51 ip-xxx [service_variant=cms][contentstore.views.preview][env:sandbox] WARNING [ip-xxx 17550] [preview.py:318] - Unable to render author_view for <VerticalBlockWithMixins #9018 graded=False, annotation_token_secret=u'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', hide_after_due=False, giturl=None, edxnotes=False, source_file=None, course_edit_method=u'Studio', default_tab=None, children=[BlockUsageLocator(CourseLocator(u'Company', u'TD101', u'2018', None, None), u'html', u'8ca5de5b3eae4d95bae0df2c4c67ccce')], in_entrance_exam=False, showanswer=u'finished', display_name=u'Unit', video_speed_optimizations=True, graceperiod=None, format=None, due=None, start=datetime.datetime(2018, 5, 3, 17, 11, tzinfo=tzutc()), xml_attributes={u'filename': [u'vertical/693fb3e8136844e7a3e8c844ece0c0ae.xml', u'vertical/693fb3e8136844e7a3e8c844ece0c0ae.xml']}, days_early_for_beta=None, visible_to_staff_only=False, parent=BlockUsageLocator(CourseLocator(u'Company', u'TD101', u'2018', None, None), u'sequential', u'd7f0672550ec45f9adb6a3e504b6c8fc'), tags=[], matlab_api_key=None, xqa_key=None, is_entrance_exam=False, annotation_storage_url=u'http://your_annotation_storage.com', use_latex_compiler=False, video_bumper={}, show_correctness=u'always', static_asset_path=u'', hide_from_toc=False, show_reset_button=False, name=None, group_access={}, video_auto_advance=False, rerandomize=u'never', user_partitions=[], chrome=None, edxnotes_visibility=True, position=None, max_attempts=None, self_paced=False>
Traceback (most recent call last):
File "/edx/app/edxapp/edx-platform/cms/djangoapps/contentstore/views/preview.py", line 316, in get_preview_fragment
fragment = module.render(preview_view, context)
File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/xblock/core.py", line 202, in render
return self.runtime.render(self, view, context)
File "/edx/app/edxapp/edx-platform/common/lib/xmodule/xmodule/x_module.py", line 1903, in render
return self.__getattr__('render')(block, view_name, context)
File "/edx/app/edxapp/edx-platform/common/lib/xmodule/xmodule/x_module.py", line 1310, in render
return super(MetricsMixin, self).render(block, view_name, context=context)
File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/xblock/runtime.py", line 812, in render
updated_frag = self.wrap_xblock(block, view_name, frag, context)
File "/edx/app/edxapp/edx-platform/common/lib/xmodule/xmodule/x_module.py", line 1262, in wrap_xblock
frag = wrapper(block, view, frag, context)
File "/edx/app/edxapp/edx-platform/openedx/core/lib/xblock_utils/__init__.py", line 259, in replace_static_urls
static_asset_path=static_asset_path
File "/edx/app/edxapp/edx-platform/common/djangoapps/static_replace/__init__.py", line 218, in replace_static_urls
return process_static_urls(text, replace_static_url, data_dir=static_asset_path or data_directory)
File "/edx/app/edxapp/edx-platform/common/djangoapps/static_replace/__init__.py", line 130, in process_static_urls
text
File "/edx/app/edxapp/venvs/edxapp/lib/python2.7/re.py", line 155, in sub
return _compile(pattern, flags).sub(repl, string, count)
File "/edx/app/edxapp/edx-platform/common/djangoapps/static_replace/__init__.py", line 122, in wrap_part_extraction
return replacement_function(original, prefix, quote, rest)
File "/edx/app/edxapp/edx-platform/common/djangoapps/static_replace/__init__.py", line 188, in replace_static_url
url = staticfiles_storage.url(rest)
File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/django/contrib/staticfiles/storage.py", line 162, in url
return self._url(self.stored_name, name, force)
File "/edx/app/edxapp/edx-platform/openedx/core/djangoapps/theming/storage.py", line 180, in _url
return super(ThemeCachedFilesMixin, self)._url(hashed_name_func, processed_asset_name, force, hashed_files)
File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/django/contrib/staticfiles/storage.py", line 141, in _url
hashed_name = hashed_name_func(*args)
File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/django/contrib/staticfiles/storage.py", line 376, in stored_name
raise ValueError("The name '%s' could not be hashed with %r." % (name, self))
ValueError: The name 'bundles/js/factories/xblock_validation.788ba995ff95.js' could not be hashed with <openedx.core.storage.ProductionStorage object at 0x7f524dd4b290>.
Any idea how to solve that? It seems to be an issue inside Django, but I can't find any information about it.
I am getting this strange error when I run my code in Scrapy Cloud. I am not sure how to debug it; there is no reference to a line in the spider code.
I assume it is about saving an item and something general, as no URL is indicated. Moreover, the spider runs fine and produces results after this error.
Any help is appreciated.
[scrapy.utils.signal] Error caught on signal handler: <bound method ?.item_scraped of <sh_scrapy.extension.HubstorageExtension object at 0x7fcdc33abf50>>
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/local/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/usr/local/lib/python2.7/site-packages/sh_scrapy/extension.py", line 45, in item_scraped
item = self.exporter.export_item(item)
File "/usr/local/lib/python2.7/site-packages/scrapy/exporters.py", line 304, in export_item
result = dict(self._get_serialized_fields(item))
File "/usr/local/lib/python2.7/site-packages/scrapy/exporters.py", line 75, in _get_serialized_fields
value = self.serialize_field(field, field_name, item[field_name])
File "/usr/local/lib/python2.7/site-packages/scrapy/exporters.py", line 284, in serialize_field
return serializer(value)
File "/usr/local/lib/python2.7/site-packages/scrapy/exporters.py", line 290, in _serialize_value
return dict(self._serialize_dict(value))
File "/usr/local/lib/python2.7/site-packages/scrapy/exporters.py", line 300, in _serialize_dict
key = to_bytes(key) if self.binary else key
File "/usr/local/lib/python2.7/site-packages/scrapy/utils/python.py", line 117, in to_bytes
'object, got %s' % type(text).__name__)
TypeError: to_bytes must receive a unicode, str or bytes object, got instance
Here is the item I yield:
{
    response.meta['url']: {
        "rss_categories": [],
        "int_links": int_links,
        "ext_links": ext_links,
        "all_links": len(all_links),
        "email": email,
        "url": url,
        "social_media": [{
            "twitter": twitter,
            "facebook": facebook,
            "instagram": instagram,
            "pinterest": pinterest,
            "youtube": youtube,
        }],
        "rss_atom": rss_atom,
        "title": title,
        "MetaDescription": MetaDescription,
        "descr": descr,
        "keywords": [{}],
    }
}
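One thing I plan to check, based on the "got instance" part of the TypeError (this is my assumption, not a confirmed cause), is whether every key and value in this nested dict is a plain built-in type before yielding. A small sketch of the coercion I have in mind:

def to_plain(value):
    # Recursively convert keys and values to built-in types so that the
    # exporter's to_bytes() never receives a custom class instance.
    if isinstance(value, dict):
        return {str(k): to_plain(v) for k, v in value.items()}
    if isinstance(value, (list, tuple)):
        return [to_plain(v) for v in value]
    if isinstance(value, (str, bytes, int, float, bool)) or value is None:
        return value
    return str(value)


# Hypothetical usage in the spider callback, before yielding the item:
# yield to_plain(item)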
I'm using the Elasticsearch client for Python, trying to query a specific path.
Here is my wrapped query:
{
    "size": 1000,
    "query": {
        "filtered": {
            "filter": {
                "bool": {
                    "must": [
                        {
                            "term": {
                                "Path": "c:\\myfolder\\myfile.txt"
                            }
                        }
                    ]
                }
            }
        }
    }
}
This works fine in the kopf plugin.
Here is my Python code:
from elasticsearch import Elasticsearch
es = Elasticsearch(hosts=['my_server'])
index = "my_index"
query = '{"size":1000,"query":{"filtered":{"filter":{"bool":{"must":[{"term":{"Path":"c:\\myfolder\\myfile.txt"}}]}}}}}'
response = es.search(index=index, body=query)
For some reason I'm getting this error (which does not occur without the backslashes):
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/utils.py", line 69, in _wrapped
    return func(*args, params=params, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/__init__.py", line 530, in search
    doc_type, '_search'), params=params, body=body)
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/transport.py", line 329, in perform_request
    status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/connection/http_urllib3.py", line 106, in perform_request
    self._raise_error(response.status, raw_data)
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/connection/base.py", line 105, in _raise_error
    raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
elasticsearch.exceptions.RequestError
This problem happens only when there are backslashes.
Note: I'm working on Ubuntu.
Thanks in advance.
Try changing the "Path" value to c:\\\\myfolder\\\\myfile.txt. In your Python string literal each \\ collapses to a single backslash, so the JSON body you send contains an invalid \m escape sequence; doubling the backslashes again leaves a proper \\ escape in the JSON, which Elasticsearch then reads as a single backslash.
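A short sketch of the two equivalent fixes (the host and index names are taken from the question):

from elasticsearch import Elasticsearch

es = Elasticsearch(hosts=['my_server'])
index = "my_index"

# Option 1: keep the JSON string, escaping each backslash twice
# (once for the Python literal, once for JSON).
query = '{"size":1000,"query":{"filtered":{"filter":{"bool":{"must":[{"term":{"Path":"c:\\\\myfolder\\\\myfile.txt"}}]}}}}}'
response = es.search(index=index, body=query)

# Option 2: pass a dict and let the client serialize it to JSON, so only
# the Python-literal escaping is needed.
query_dict = {
    "size": 1000,
    "query": {
        "filtered": {
            "filter": {
                "bool": {
                    "must": [{"term": {"Path": "c:\\myfolder\\myfile.txt"}}]
                }
            }
        }
    },
}
response = es.search(index=index, body=query_dict)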