I'm trying to load an Excel worksheet into Power BI Desktop, and when it reaches around 50 MB it fails with the error "Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [DataFormat.Error] The input couldn't be recognized as a valid Excel document.. '."
If I reduce the size of that worksheet to about half, it loads fine.
The other worksheets in that document (all much smaller) load perfectly. I tried loading the failing worksheet first and got the same error. If I load the other worksheets first, they load fine, but when I add the failing worksheet, I get the same error for all the worksheets.
Can someone suggest anything that I could try to resolve this?
Thanks in advance
V
As the error message indicates, the problem may be caused by a formatting error. To get a list of files found in a OneDrive folder, you can try again by following these steps:
Get a list of files (URLs) via
Source=SharePoint.Tables("https://your.sharepoint.com/personal/john_doe_acme_com/",[ApiVersion=15])
Expand the "Table" column of the Documents entry
Expand "File.LinkingUrl" and split by "?" to get a clean URL
Filter down to the desired folder (via URL, Folder.Name, or another attribute)
Add a Custom Column = Excel.Workbook(Web.Contents([File.LinkingUrl]))
Click this link for detailed information:
https://community.powerbi.com/t5/Desktop/Error-when-connecting-folders-containing-excel-files/m-p/2343510#M845219
We run a Power BI subscription to generate visualisation reports in PDF format, and we get many errors like this:
There is no data for the field at position x
The problem is that we have searched about it many times and found that it may occur due to missing data in the dataset.
But we have about 30 datasets, each with a query to an Oracle database; we cannot figure out which data is missing, and the log does not mention which report caused the error.
Is there a way to figure out which field is missing?
Or is there a way to enrich the report error log so it tells us which report failed?
A sample of the exact error, which is repeated with different positions:
processing!ReportServer_0-8!1e18!02/07/2022-09:56:36:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: , Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 29.;
Dears
I found a solution that helped me, and I will share it.
The error is due to missing data, not missing values: in our case, the column name referenced by the dataset field had been changed in the database.
Note:
When the value is null, it will not give this same error; even if the field is used in the report, a null value gives a different error.
How do you detect it?
Simply install Report Builder on a machine that has a connection to this database, open the report in Report Builder, and verify the fields. It will give a detailed error with the names of the dataset fields that were not found. We tracked those fields in the database, found that the column name had been changed, and fixing it (in either the dataset or the database column name) resolved the issue.
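If you cannot open every report in Report Builder, a quick cross-check on the database side is to compare the field names the dataset expects against the Oracle data dictionary. A minimal sketch, with made-up table and column names:
select column_name
  from all_tab_columns
 where table_name = 'MY_ORDERS'         -- made-up table used by the dataset query
   and column_name = 'DISCOUNT_CODE';   -- field name the report expects
If no row comes back, the column was dropped or renamed, which is exactly the situation described above.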
A new challenge we are going to handle: whether the column name exists or has been changed, we never want to get an error; an empty report is better, because some of the databases the report connects to may not have the same column names, so it should show an empty part of the report instead of an error.
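One idea to experiment with (not something from this thread, so treat it as a sketch): for databases that do not have the column, use a variant of the dataset query that keeps the expected field name alive with a NULL literal, so the field is still defined and the report can render it empty. Given the note above about null values, verify how the report behaves before relying on it.
-- variant of the dataset query for a database missing the column (made-up names)
select order_id,
       order_date,
       null as discount_code   -- the field stays defined for the report, values are just empty
  from orders;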
thanks BR,
One of the tables in my DB has a BLOB column that stores images, and I am now setting up the page for this table. I have a bunch of Interactive Grids and such to process most of the data, but I set up a modal page to process the image.
The modal page gets the ID (which is the PK) into an item, and then it reads the image currently in the table into a 'Display Image' item. And I have a 'File browse...' item to upload new images.
Except I cannot get it to save.
I initially started with the display image item having its setting "Based On" set to "BLOB column returned by SQL statement", but I couldn't get the source to work with the SQL query (error: "Expected CHAR, source is BLOB"). I managed to resolve this by putting automatic row processing on the page and then having the source be a column.
So now it displays well, with no errors.
But the save does nothing. I have tried saving by having the File browse reference the column and using automatic row processing, and there is just nothing. No errors pop up, but it just does nothing.
I have tried saving to APEX_APPLICATION_TEMP_FILES and then having a PL/SQL dynamic action (DA) or a PL/SQL process to
SELECT blob_content
FROM APEX_APPLICATION_TEMP_FILES
WHERE name = :FILE_BROWSER_ITEM
And insert this into the table, but it just pops up a 'No data found' error.
I have gone through every bit of intel my google-fu has found, but I have failed to find a solution.
So I would appreciate any insight any of you might have.
Since no one answered, I stepped away from it for a bit and tried again at a later date. Now I have finally made it work.
I set up automatic row fetch and automatic row processing but disabled both of them; for some reason, automatic row processing must be present so that the source of the display image and file browse items can be the column.
Then I set the file browse item to load into APEX_APPLICATION_TEMP_FILES,
and set up a process to be executed at page submit (after the automatic row processing, even though it is disabled and shouldn't matter). The process executes the following code:
BEGIN
   -- copy the uploaded file's BLOB from the APEX temp table into the target row
   UPDATE MY_TABLE
      SET MY_IMAGE = (SELECT blob_content
                        FROM apex_application_temp_files
                       WHERE name = :FILE_BROWSER_ITEM)
    WHERE id = :ID;
END;
I execute the page submit through a button whose action is Submit Page, with the database action set to SQL UPDATE.
I am guessing a fair bit of the things I did and have set up don't even matter, but I don't dare remove them for fear of breaking shit. What I have described here finally works for me; if you stumble upon this you can try it, and I hope it works for you too. You can also try removing some of the disabled stuff and see if it still works.
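For anyone adapting this, here is a slightly more defensive variant of the same process. The item and table names are the ones used above, but the guards are my own addition, so treat it as a sketch rather than something APEX requires:
DECLARE
   l_blob apex_application_temp_files.blob_content%TYPE;
BEGIN
   -- only touch the row when a file was actually uploaded
   IF :FILE_BROWSER_ITEM IS NOT NULL THEN
      SELECT blob_content
        INTO l_blob
        FROM apex_application_temp_files
       WHERE name = :FILE_BROWSER_ITEM;
      UPDATE MY_TABLE
         SET MY_IMAGE = l_blob
       WHERE id = :ID;
   END IF;
EXCEPTION
   WHEN NO_DATA_FOUND THEN
      NULL;  -- uploaded file not found in the temp table; leave the row unchanged
END;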
When adding an image list to a dataset, I receive the following error:
"Incorrect padding. The transaction could not be committed. Please try
again."
The image list contains 410k items and takes 15+ minutes to import, so just trying again doesn't seem the right course of action.
What does "Incorrect padding" mean and what can I do about it?
It seems that your CSV has an invalid format. What you can try is loading the CSV into a spreadsheet and making sure all paths start with the gs:// prefix.
We are working to upgrade our application to a more current version of Ruby & Rails. Our app integrates with a legacy database (SQL Server 2008 R2) that has a table with a column of image data type (we are unable to change this column to varbinary(max)). Previously we were able to save a binary into the image column. However now we are getting conversion errors.
We are working to upgrade to the following (among others):
Rails 4.2.1
ActiveRecord_SQLServer_Adapter (4.2.4)
tiny_tds (0.6.3.rc1)
freeTDS (v0.91.112)
When we now attempt to save into the image column, we get errors similar to:
TinyTds::Error: Unclosed quotation mark after the character string
Researching various issues within tiny_tds & activerecord_sqlserver_adapter, we decided to create a second table that matched the first but changed the data type from image to varbinary(max). We can save a binary into that column.
The code causing the challenge is in a background job where we grab images from S3, store them locally, and then push the image into the database. Again, we don't control the legacy database and thus can't change the data type (or confront the issue of why we are storing the image in the DB in the first place).
...
d = Doc.new
...
open("#{Rails.root}/cache/pictures/image.png", "wb") do |file|
file << open(r.image.url).read
end
#d.document = File.binread("#{Rails.root}/cache/pictures/image.png")
#d.save!
Given that the upgrade has broken saving images, we are trying to figure out how best to fix it. We could obviously roll back until we find a version that works; however, we hope to find a proper fix. Anyone have any ideas?
Update:
We added the following configuration because we had triggers on the table being inserted into: ActiveRecord::ConnectionAdapters::SQLServerAdapter.use_output_inserted = true
When we remove this configuration we get the following error:
TinyTds::Error: The target table 'doc' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause.
Note: We are unable to make any modifications to the triggers.
Per feedback on the ActiveRecord_SQLServer_Adapter site, we rolled back to 4.1.11 and we are now able to save into the image column.
We also had to keep the use_output_inserted configuration shown above to overcome the issue with the triggers.
I've got a form in which I want to display either a download link for a BLOB or use a file browser field to do the same.
I can manage the file browser method normally; however, because the BLOB I want to refer to isn't part of the table the form is based on, I can't seem to get it to show properly.
The best I've got so far is a 'display only' field with an SQL query returning the size of the file.
If you are using Oracle Application Express, you may use the "P" procedure.
Just like the "F" procedure that you use to show pages, the "P" procedure allows you to download files from APEX.
For example:
http://apex.shellprompt.net/pls/apex/p?n=217605020644166778
where the number at the end is the primary key from the query
select id
from apex_application_files
To add a file to this table, simply add a file browse item on a page. Run the page, browse for a file, and submit the page. It will automatically insert the file into this table.
You can query it afterwards:
select id
  from apex_application_files
 where filename = 'YOUR_FILE'
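Putting the two pieces together, a small sketch (the filename is a placeholder) that builds the download link for each matching file, using the same p?n= pattern as the example URL above:
select id,
       filename,
       'p?n=' || id as download_url   -- relative link you can render on your page
  from apex_application_files
 where filename = 'YOUR_FILE';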
A good post on this is: http://dgielis.blogspot.com/2007/08/oracle-apex-fp-pn-zp.html
test it out, tell me if you get stuck