Oracle APEX: How to get an image URL?

I'm running APEX 19.2 on Oracle 18c, and I would like to get the URLs of some images so I can show them in the application. The images are stored in the database as BLOBs (not static images).
For the moment, what I did is create an ORDS RESTful service that connects to the database and loads the images. The images are then accessible via a URL that I insert in my pages:
<img src="URL to my RESTful service module with the image identifier">
This works well, but I find it quite complex and, most importantly, it's very slow and doesn't cache the image. Whenever I load the page I have to wait for the image to load (even though it's very small: 50 KB).
Does anyone have a solution for this, please? Is there any APEX out-of-the-box solution, as for static images?
Thanks,
Cheers

There is no direct method to expose BLOBs to the end user, as securing those files would be somewhat complicated. I can suggest the following two methods:
1. Keep the code just as you have it, but consider putting it in an application process. This way, you can use all your session variables directly. You will then be able to generate a link that does exactly what you want, or call the process from a button or branch. There is a nice tutorial here:
https://oracle-base.com/articles/misc/apex-tips-file-download-from-a-button-or-link
2. Use APEX_UTIL.GET_BLOB_FILE_SRC.
This function only works from within an APEX session and requires you to set up an application page with an item that holds a primary key to your table. I doubt that this is what you want.
Note that APEX_MAIL.GET_IMAGES_URL does not work for your use case - it only works for files in your Shared Components application files or workspace files.
I actually like your approach, because it may be more lightweight than method 1. The fact that the image gets reloaded every time probably does not depend on the method you are using; it is more likely due to the headers you are sending out. Take a look at the cache-control headers on this page:
https://developer.mozilla.org/de/docs/Web/HTTP/Headers/Cache-Control
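If the browser re-fetches the image on every page load, the headers your REST module returns are the first thing to check. A minimal Python sketch for inspecting them (the ORDS URL is a hypothetical placeholder):

import requests

# Hypothetical ORDS endpoint that serves the image BLOB
url = "https://example.com/ords/myapp/images/42"

# HEAD request: fetch only the response headers, not the image body
response = requests.head(url, timeout=10)

# If Cache-Control is missing or set to no-store/no-cache, the browser
# will download the image again on every page load
print(response.status_code)
print(response.headers.get("Cache-Control"))
print(response.headers.get("ETag"))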

Maybe check out APEX_MAIL.GET_IMAGES_URL.
It is supposed to do essentially what you need, so perhaps you can use it.

Related

Save and display images on Google Cloud App Engine

I am currently developing a Flask application that dynamically generates images. I save the images to the static/img folder.
But an image never changes after it is first created.
Does anybody know the issue behind this?
Many thanks.
It could be a caching issue (especially since you're saving to a static folder). Try appending a dummy query parameter, e.g. <your_url>/?123. If you see the new file, then it's a caching issue. One quick and dirty fix would be to generate unique values and append them to the URL, or you can look up cache-busting techniques for GAE.
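A minimal Flask sketch of that quick-and-dirty fix (the image path img/plot.png is a hypothetical placeholder):

import time
from flask import Flask, render_template_string, url_for

app = Flask(__name__)

@app.route("/")
def index():
    # The extra "v" parameter is appended as a query string, so the browser
    # sees a new URL each time and bypasses its cached copy of the image
    image_url = url_for("static", filename="img/plot.png", v=int(time.time()))
    return render_template_string('<img src="{{ src }}">', src=image_url)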

Saving Interactive Bokeh Chart

I have created an interactive Bokeh chart with various widgets that allow manipulation of the data. I now want to understand the standard way of sharing such a plot, or how to save it for sharing.
The plot is created with the curdoc method and then output to the Bokeh server using session.show().
# imports for the old bokeh.client API assumed by this snippet
from bokeh.client import push_session
from bokeh.models import HBox
from bokeh.plotting import curdoc

# create current visualization using plot p and widget inputs
curdoc().add_root(HBox(inputs, p, width=1100))
# run the session
session = push_session(curdoc())
session.show()               # open the document in a browser
session.loop_until_closed()  # run forever
Does the app trigger actual Python code?
If not, you might consider reworking it as a non-server standalone document (using CustomJS callbacks, for instance). That would just generate a self-contained static HTML file that you could publish or send anywhere, and have it be immediately accessible.
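For instance, a minimal standalone sketch (hypothetical data, written against the current Bokeh API; no server involved, the callback runs entirely in the browser):

from bokeh.layouts import column
from bokeh.models import ColumnDataSource, CustomJS, Slider
from bokeh.plotting import figure, output_file, show

# Hypothetical data; the CustomJS callback rescales y in the browser
source = ColumnDataSource(data=dict(x=[1, 2, 3, 4], y=[1, 2, 3, 4]))
plot = figure()
plot.line("x", "y", source=source)

slider = Slider(start=1, end=10, value=1, step=1, title="Scale")
slider.js_on_change("value", CustomJS(args=dict(source=source, slider=slider), code="""
    const data = source.data;
    data.y = data.x.map((x) => x * slider.value);
    source.change.emit();
"""))

output_file("standalone.html")  # a self-contained HTML file you can send anywhere
show(column(slider, plot))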
If your app does rely on executing actual Python code to do the work, then it needs to actually be running somewhere for users to interact with it. First off, I would suggest you make a real app that runs in the server, like the ones in the demo app gallery (see also Use Case Scenarios in the User's Guide). A real server app, i.e. one you run like bokeh serve myapp.py, is definitely preferred over using bokeh.client, especially for "publishing" scenarios (it will also be simpler, with less code, and more performant). Then, distributing the app could mean a few things:
You give them the script and they run bokeh serve app.py locally themselves
You "deploy" the app by leaving it running on a server with a URL that is accessible to users who you want to be able to see it
Depending on how much compute the app does, and how many users you expect at a given time, the second option could be as simple as running bokeh serve app.py somewhere. But if there is heavy compute or you expect a lot of traffic, you may need more sophisticated "scale out" deployments behind a load balancer. More information is in Deployment Scenarios in the User's Guide, and of course we are happy to help with more extended discussions on the public mailing list. Finally, I should mention that in the near future, automated scalable publishing of Bokeh applications will be available as a feature on https://anaconda.org/
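For reference, a minimal sketch of such a server app (hypothetical data; save it as app.py and run bokeh serve app.py):

# app.py -- run with: bokeh serve app.py
from bokeh.layouts import column
from bokeh.models import ColumnDataSource, Slider
from bokeh.plotting import curdoc, figure

source = ColumnDataSource(data=dict(x=[1, 2, 3, 4], y=[1, 2, 3, 4]))
plot = figure()
plot.line("x", "y", source=source)

slider = Slider(start=1, end=10, value=1, step=1, title="Scale")

def update(attr, old, new):
    # Real Python runs on the server for every slider move
    source.data = dict(x=[1, 2, 3, 4], y=[x * new for x in [1, 2, 3, 4]])

slider.on_change("value", update)
curdoc().add_root(column(slider, plot))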

Offline use of the Google Earth plugin

I have a use case that requires offline access to Google Earth. I know that Google Earth Enterprise offers a disconnected product; however, we may not have access to that product, and/or Google Earth Enterprise is prohibitively expensive at $25K for a dev license.
I would prefer to use the Google Earth plugin since I am building an application and would like to use the JS API. Is it possible to host the Google Earth plugin on my own disconnected server? We would use Google Earth connected to a standalone offline WMS server for access to imagery.
Said another way, can I host the plugin and corresponding JavaScript on my own server?
I do not know if I understand your problem correctly, but I can explain what I'm currently working on.
In my current application with the Google Earth plugin JS API, I'm able to start the plugin even when offline. But one requirement is to have cached data.
If you have cached data and you start the plugin offline, then zooming to a level with higher resolution than the one in your cached data will have no effect (the imagery will not update to a higher resolution).
But depending on what you really need: yes, you can start the plugin offline.
This is not really answering your original question, but if you are interested, just tell me :-)
I tried to cache Google Earth with a proxy server, but I couldn't.
Furthermore, I think the API is validated against Google's servers every time it loads and doesn't allow offline use.
It's been some months now since I worked with this.
I'll try to explain what I can remember :-)
In the HTML where I have my plug-in, I removed:
<script type="text/javascript" src="https://www.google.com/jsapi">
but I saved this jsapi.js file locally. I also saved loader_1-008.js locally.
Then, in my code (C++, Qt) I call evaluateJavaScript(QString source) twice,
where source is the text read from my two .js files.
These two evaluateJavaScript calls need to be made before loading my HTML (the one with the plugin)
into my QWebView.
I can't remember much more, but I hope this can start to help you.

Django + S3 (boto) + Sorl Thumbnail: Suggestions for optimisation

I am using an S3 storage backend across a Django site I am developing, both to reduce load on the EC2 server(s) and to allow multiple webservers (redundancy, load balancing) to access the same set of uploaded media.
Sorl.thumbnail (v11) template tags are being used in our templates to allow flexible image resizing/cropping.
Performance on media-rich pages is not very good, and when a page containing thumbnails that need to be generated for the first time is accessed, the requests even time out.
I understand that this is due to sorl thumbnail checking/downloading the original image from S3 (which could be quite large and high resolution), and rendering/checking/uploading the thumbnail.
What would you suggest is the best solution to this setup?
I have seen suggestions of storing a local copy of files in addition to the S3 copy (not so great when a couple of servers are being used for load balancing). I've also seen it suggested to store 0-byte files to fool sorl.thumbnail.
Are there any other suggestions or better ways of approaching this?
sorl-thumbnail is now designed with slow remote storages in mind. The first creation of a thumbnail does query the storage (for example, when it is first accessed from a template), but after that the references are cached in a key-value store. You still need that first query and creation; one solution is to call the low-level API sorl.thumbnail.get_thumbnail with the same options when the image is uploaded. When the image is uploaded, add this thumbnail-creation job to a queue like Celery.
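A minimal sketch of that idea (assuming Celery and sorl-thumbnail are already configured; the geometry and options are hypothetical and should match what your templates request):

from celery import shared_task
from sorl.thumbnail import get_thumbnail

@shared_task
def pregenerate_thumbnail(image_path):
    # Render the thumbnail with the same geometry/options the template uses,
    # so the cached reference already exists on the first page view
    get_thumbnail(image_path, "100x100", crop="center", quality=80)

You would then call something like pregenerate_thumbnail.delay(photo.image.name) right after the upload is saved.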
You can use Sorlery. It combines sorl and Celery to create thumbnails via workers. It's very careful not to do any filesystem access outside of the worker thread.
The thumbnail returned immediately (before the worker has had a chance to run) can be controlled by setting THUMBNAIL_DUMMY_SOURCE to an appropriate placeholder.
The job is created the first time the thumbnail is requested; subsequent requests are served the dummy image until the worker thread completes.
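For example, in settings.py (the placeholder URL is hypothetical):

# Placeholder image served until the worker finishes the real thumbnail
THUMBNAIL_DUMMY_SOURCE = "https://example.com/static/img/placeholder.png"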
Almost the same as #Aidan's solution: I have made some tweaks to sorl-thumbnail, and I also pre-generate thumbnails with Celery. My code is here: sorl_thumbnail-async
But I came to learn that easy_thumbnails does exactly what I was trying to do, so I am using it in my current project. You might find it useful; a short post on the topic is here
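For reference, a minimal easy_thumbnails sketch (the field passed in is a hypothetical Django ImageField file; the options mirror what a template tag would request):

from easy_thumbnails.files import get_thumbnailer

def thumb_url(image_field):
    # Generate (or fetch, if already cached) a 100x100 cropped thumbnail
    thumbnailer = get_thumbnailer(image_field)
    thumb = thumbnailer.get_thumbnail({"size": (100, 100), "crop": True})
    return thumb.url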
The easiest solution I've found so far is actually this third party service: http://cloudinary.com/

Programmatically accessing sitecore layouts, templates and moving it to other site

We have a need to programmatically access the layouts/templates of one Sitecore site and move them to another site under different folders. Basically, the intent is to restructure the existing site, which is already in production.
Could anyone tell me how we go about it?
Instead of writing a custom "one-time-use" tool for this, I would recommend taking advantage of the standard "Transfer Items" application. You can find it in the Sitecore Control Panel: go to Database > Transfer Items to Another Database.
So, what you basically need to do:
1. Plug in the master database from the new target site into this older site, named something like "master_new". This will require a web.config modification. The section on SDN about publishing targets should have a guideline on how to do this.
2. Run the "Transfer Items" application: select the templates/layouts you need on the first page, then select this "master_new" database as the target database and the place in the content tree to transfer to on the second one.
3. Run the actual transfer.
If your layouts/templates are grouped into folders, this process will take minimal time - much less compared to creating a custom script...
UPDATE: Some sample code showing how to trigger this application programmatically:
// Build the URL of the built-in Transfer Items dialog and open it as a modal
UrlString url = new UrlString(UIUtil.GetUri("control:TransferToDatabase"));
Context.ClientPage.ClientResponse.ShowModalDialog(url.ToString());
I would look into using Sitecore PowerShell Extensions (look at the Marketplace). It is a perfect fit for a use case like this.