I need to load data from Excel into a table, i.e. upload an Excel file from an APEX application and then load the data into an Oracle database table. I have been advised to avoid plugins, so I am looking for an Oracle APEX process with which I can do this.
Any advice/steps/links to information would be very helpful.
The file to be loaded is .xlsm with macros, but the macros can be ignored.
APEX 20.2
There's nothing built-in that lets your users load an Excel file into your APEX app and store the data in a database table.
Personally, I've used Carsten Czarski's XLSX_PARSER package with good results. The source code is clean, and I even enhanced it to handle my custom parsing needs.
https://blogs.oracle.com/apex/easy-xlsx-parser%3a-just-with-sql-and-plsql
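For reference, this is roughly what such a page process looks like. The sketch below uses the APEX_DATA_PARSER API that ships with 19.1+, since the pattern is the same pipelined one; if you use XLSX_PARSER instead, check the blog post for its exact function name and parameters. The file-browse item and target table names are placeholders.

    declare
      l_blob     blob;
      l_filename varchar2(400);
    begin
      -- File Browse item P1_FILE stored in APEX_APPLICATION_TEMP_FILES (placeholder item name)
      select blob_content, filename
        into l_blob, l_filename
        from apex_application_temp_files
       where name = :P1_FILE;

      -- MY_TARGET_TABLE and its columns are made up; map COL001.. to your real columns
      insert into my_target_table (col_a, col_b, col_c)
      select col001, col002, col003
        from table( apex_data_parser.parse(
                      p_content   => l_blob,
                      p_file_name => l_filename ) )
       where line_number > 1;   -- skip the header row
    end;

For an .xlsm file the parser may not recognize the extension, so you may need to pass the file type explicitly; see the API documentation.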
I found these two good references via Google:
Jeff Kemp's great review of many possible solutions for various data formats, including Excel - https://jeffkemponoracle.com/2018/11/load-spreadsheet-data-into-apex/
Anton Scheffer's Excel2Collections package - https://github.com/antonscheffer/excel2collections
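If you go the Excel2Collections route, the parsed rows end up in an APEX collection, which you read back through the APEX_COLLECTIONS view. A minimal sketch, assuming the collection is named EXCEL_DATA (the actual name depends on how you call the package/plugin) and a made-up target table:

    -- Placeholder names throughout; the collection name comes from your excel2collections call
    insert into my_target_table (col_a, col_b, col_c)
    select c001, c002, c003
      from apex_collections
     where collection_name = 'EXCEL_DATA'
       and seq_id > 1;   -- skip the first row if the spreadsheet has a header line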
Yes, you can do it without even using the data loading wizard provided by Oracle APEX. Here is a video in which I explain in detail how you can do it, in two simple steps.
Oracle APEX Upload Excel Data into Table or Collection in an Easy way without a Data loading feature
Link: https://youtu.be/ihJGYV0yDWE
Related
Using CSV upload in Apache Superset works as expected. I can use it to add data from a CSV to a database, e.g. Postgres. Now I want to append data from a different CSV to this table/dataset. But how?
The CSVs all have the same format, but there is a new one every day. In the end, I want a dashboard that updates every day, taking the new data into account.
Generally, I agree with Ana that if you want to repeatedly upload new CSV data then you're better off operationalizing this into some type of process or pipeline that runs on a schedule.
But if you need to stick with the uploading CSV route through the Superset UI, then you can set the Table Exists field to Append instead of Replace.
You can find a helpful GIF in the Preset docs: https://docs.preset.io/docs/tips-tricks#append-csv-to-a-database
Probably you'll be better served by creating a simple process to load the CSV to a table in the database and then querying that table in Superset.
Superset is a tool for visualizing data. It allows uploading a CSV for quick-and-dirty, one-off charts, but if this is going to be a recurring, structured, periodic load of data, it's better to use a proper integration tool to load it. There are zillions of ETL (Extract-Transform-Load) tools out there (or scripting programs to do it); ask whether your company is already using one, or choose whichever is simplest for you.
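As a rough illustration of that kind of scheduled load (assuming a Postgres target; the table, columns and file path below are made up), you could create the table once and then have a cron job run a client-side \copy every day:

    -- One-time setup: a table matching the CSV layout (names/types are placeholders)
    CREATE TABLE IF NOT EXISTS daily_metrics (
      metric_date  date,
      metric_name  text,
      metric_value numeric
    );

    -- Daily append, e.g. run from cron:
    --   psql "$DATABASE_URL" -c "\copy daily_metrics FROM '/data/today.csv' CSV HEADER"

Then point the Superset dataset at daily_metrics and the dashboard will pick up each day's new rows.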
Each interactive report's data should go into its own tab, i.e. Sheet1, Sheet2, ....
Example: on an APEX page, I have 5 report regions. Each report's data should be downloaded into its own sheet, in a single Excel file. Please provide me a solution. Thanks in advance, Karthick.
If you cannot upgrade your APEX environment then, as TineO said, probably AOP is the best option you have.
If you can upgrade to 19.2 or higher, you can use the APEX_REGION.OPEN_QUERY_CONTEXT API to get the records of the interactive reports. Then you can use the open-source Alexandria library, in particular its xlsx_builder_pkg package, to create the spreadsheet.
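A rough sketch of how the two pieces fit together for one region (the page and region IDs are placeholders, and the xlsx_builder_pkg procedure names/signatures should be checked against the package spec, since forks of as_xlsx differ slightly):

    declare
      l_context apex_exec.t_context;
      l_row     pls_integer := 1;
    begin
      xlsx_builder_pkg.new_sheet( 'Report 1' );   -- one sheet per report region

      -- open a query context over the interactive report region (placeholder IDs)
      l_context := apex_region.open_query_context(
                     p_page_id   => 10,
                     p_region_id => 123456789 );

      while apex_exec.next_row( p_context => l_context ) loop
        l_row := l_row + 1;
        for i in 1 .. apex_exec.get_column_count( l_context ) loop
          xlsx_builder_pkg.cell(
            p_col   => i,
            p_row   => l_row,
            p_value => apex_exec.get_varchar2( p_context => l_context, p_column_idx => i ) );
        end loop;
      end loop;
      apex_exec.close( l_context );

      -- repeat new_sheet / open_query_context for the other four regions, then hand the
      -- BLOB returned by xlsx_builder_pkg.finish to your download code
    end;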
I am completely new to databases and was wondering if there is a way to directly upload the data in an Excel file to Oracle APEX. If not, what would be the best way to upload small datasets (.CSV extension, around 15 MB)?
APEX enables you to upload CSV files directly using the Data Load Wizard. You can find a lot of tutorials; here is just one.
You can also upload Excel files using the following methods:
EXCEL2COLLECTIONS Plugin
Create a procedure that translates the Excel data into strings. Tutorial here.
Using the AS_READ_XLSX package (see the sketch below). Tutorial here.
Don't be shy about using Google; you will find many more options.
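For the AS_READ_XLSX option, the idea is a pipelined READ function that returns one row per spreadsheet cell (sheet/row/column coordinates plus typed values), which you then map into your own table. A rough sketch, with placeholder item and table names (the exact column names of the returned type may vary between versions of the package):

    declare
      l_blob blob;
    begin
      -- uploaded file from a File Browse item (placeholder item name)
      select blob_content
        into l_blob
        from apex_application_temp_files
       where name = :P1_FILE;

      -- one row per cell; filter/pivot as needed before inserting into your own table
      for c in ( select row_nr, col_nr, string_val, number_val, date_val
                   from table( as_read_xlsx.read( l_blob ) )
                  where sheet_nr = 1
                  order by row_nr, col_nr )
      loop
        null;   -- map c.row_nr / c.col_nr to your target table's columns here
      end loop;
    end;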
I have to migrate some legacy data from a stand-alone SQL Server database to a SharePoint list.
I'm going to use a programmatic approach and write code that communicates with the SharePoint list ASMX web service.
Are there any "data transformation wizards" to simplify such a task, or a better approach for porting legacy data from a SQL Server database to a SharePoint list?
Thank you in advance!
This being a one-time operation, I would not worry about best practice but would consider the fastest way to do it.
You can use Excel 2010 (I have not tested it with Excel 2007) to export data to SharePoint 2010. Here are the high-level steps:
Import the data from SQL Server using the Data tab in the ribbon.
Excel will automatically create a table.
Now you can prepare the data for export to SharePoint: remove unwanted columns, add new columns, remove unwanted rows, rearrange columns, etc.
While in the table, use the "Export Table to SharePoint List" functionality to publish your data to SharePoint. More information is available at: http://office.microsoft.com/en-gb/excel-help/export-an-excel-table-to-a-sharepoint-list-HA010131472.aspx
It is quick! But be aware of the limitations:
1. It cannot publish data to a list which already exists
2. It will not create a content type for the exported list. The columns are directly attached to the list.
If you want greater control over the migration, programming may be the way to go unless someone has a better idea in this great forum!
Can I use a web service as a data source for creating Excel pivot tables?
Currently, the source data for the pivot table is exported from our SQL db to a CSV file. Then the CSV file is loaded into a worksheet, and from there a pivot table is created in the same workbook.
Customers log in to a website, click some links, and an Excel file (with data and a pivot table) is generated. This is a public app, so the preference is not to connect directly to the DB.
We control the database and generate the output. We are looking to streamline this process. The SQL db and pivot tables cannot / will not change.
See http://www.vertex42.com/News/excel-web-query.html
What format does the "public-facing website" use in making the data available? A data file, a table on a web page? This issue will determine how much of a scraping operation you'll need to do.
You'll still need to write the web service and have it run on a server. A possible alternative is to use Yahoo Pipes to do the conversions for you.