Is there a way to load a config file into my custom Chromium browser? - c++

I'm developing my own Chromium-based browser. At the moment I'm loading a huge amount of data into Chromium via the command line. The reason I chose to load the configs through the command line is that the data is stored directly in memory and is accessible from anywhere in the Chromium project. Unfortunately that's not enough, because I'm going to load presets of data that take up 1 MB or even more.
Is there a convenient way to load my configuration file into memory once at program start so I can access it later, the same way I access the command line? Thanks for your help!
I did try loading the config into Chrome prefs as key-value pairs, but I couldn't get those prefs to load later. 😥
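One pattern that gives you that behaviour (a minimal sketch in plain C++, not Chromium's actual API; ConfigStore and the key=value file format are just illustrative assumptions) is to read the file once at startup into a process-wide singleton, so the values stay in memory and can be queried from anywhere, much like command-line switches:

// config_store.h - illustrative sketch, not part of Chromium.
// Reads a key=value config file once at startup and keeps it in memory.
#include <fstream>
#include <map>
#include <string>

class ConfigStore {
 public:
  // Process-wide instance, constructed on first use.
  static ConfigStore& GetInstance() {
    static ConfigStore instance;
    return instance;
  }

  // Call once, early in startup (e.g. where command-line switches are parsed).
  bool LoadFromFile(const std::string& path) {
    std::ifstream in(path);
    if (!in)
      return false;
    std::string line;
    while (std::getline(in, line)) {
      const auto pos = line.find('=');
      if (pos == std::string::npos)
        continue;  // skip blank/malformed lines
      values_[line.substr(0, pos)] = line.substr(pos + 1);
    }
    return true;
  }

  // Usable from anywhere afterwards, similar to looking up a switch.
  std::string Get(const std::string& key) const {
    const auto it = values_.find(key);
    return it == values_.end() ? std::string() : it->second;
  }

 private:
  std::map<std::string, std::string> values_;
};

Call ConfigStore::GetInstance().LoadFromFile(path) once during browser startup, then ConfigStore::GetInstance().Get("some_key") wherever you need a value; if your presets are structured (e.g. JSON), the same pattern applies, you just swap the parsing.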

Related

What is the use of 'preview assets' in XCode 12?

This question is answered with "it contains assets for the preview canvas", which isn't enough information.
Does the preview assets folder give me any additional power over the preview canvas? If so, how can I utilize it?
Preview Assets is simply a development-time-only catalog of resources that is registered by default.
So you can store any images, colors, or files there, i.e. any resources that are used only in the Preview canvas, for testing purposes, for example so you don't have to download them from the internet or the cloud, or fetch them from a database. The preview is meant as a fast, UI-only look and test, so the data source is not important; to test and tune the UI you don't need to fetch external data, you can use locally stored test data instead.
You can also add and name any other development-time assets or folders in there.

Apex Oracle : How to get image URL?

I'm running APEX 19.2 on Oracle 18c and I would like to get the URLs of some images to show them in the application. The images are stored in the database as BLOBs (not static images).
For the moment, what I did is create an ORDS RESTful service that connects to the database and loads the images. The images are then accessible via a URL that I insert into my pages:
<img src="URL to my Restfull service module with the image identifier">
This works well, but I find it quite complex and, most importantly, it's very slow and the image isn't cached. Whenever I load the page I have to wait for the image to load (even though it's very small: 50 KB).
Does anyone have a solution for this, please? Is there any out-of-the-box APEX solution, like there is for static images?
Thanks,
Cheers
There is no direct method to expose BLOBs to the end user, as it would be somewhat complicated to secure these files. I can suggest the following two methods:
1. Use the code just as you did, but consider putting it in an application process. This way you can use all your session variables directly. You will then be able to generate a link that does exactly what you want, or call the process from a button or branch. There is a nice tutorial here:
https://oracle-base.com/articles/misc/apex-tips-file-download-from-a-button-or-link
2. Use APEX_UTIL.GET_BLOB_FILE_SRC. This function only works from within an APEX session and requires you to set up an application page with an item that holds the primary key of your table. I doubt that this is what you want.
Note that APEX_MAIL.GET_IMAGES_URL does not work for your use case - it only works for files in your Shared Components application files or workspace files.
I actually like your approach, because it may be more lightweight than 1). The fact that the image gets loaded again every time probably does not depend on the method you are using; it is more likely due to the headers you are sending out. Take a look at the Cache-Control headers described on this page:
https://developer.mozilla.org/de/docs/Web/HTTP/Headers/Cache-Control
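For example (the values here are purely illustrative, not something ORDS sets for you), sending a response header such as

Cache-Control: public, max-age=86400

tells the browser it may reuse the image for a day instead of requesting it again on every page load.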
Maybe check out APEX_MAIL.GET_IMAGES_URL
It is supposed to do essentially what you need so perhaps you can use it.

Perforce (AWS Lightsail Windows Server Instance) - Unreal Engine Source Control - Move Perforce Depot

I'll give a bit of background on the setup we have and why. My friend and I want to collaborate on an Unreal Engine project. To do this I've set up an Amazon Lightsail instance running Windows Server. I've then installed Perforce on this server and added two users. Both of us are able to connect to this server from our local machines (great, I thought!). Our goal was to attach two 'virtual' disks of 32 GB to this server via Lightsail's storage option. I've formatted these disks and they are detected as disks D and E on the server. We wanted two depots, one on disk E and one on disk D, the reason being that the C disk was only 20 GB (12 GB free after Windows).
I have tried multiple things (not much hair left after this) to map the created depots to each disk, but have had little success and need your wisdom!
I've followed the process indicated in this support guide (https://community.perforce.com/s/article/2559) via CMD, as well as changing the depot storage location in P4Admin on the server (via RDP) to the virtual disks D and E respectively.
An example change is from //UE_WIP/... to D:/UE_WIP/... (I have created a UE_WIP and a UE_LIVE folder on each disk).
When I open P4V on my local machine I'm able to connect happily (as per the screenshot) and set my workspace to my local machine (it detects both depots). This is where we're getting stuck. I then open a new Unreal Engine file, save it to the local directory E:/DELETE/Perforce/Test/, and open up source control (see image 04). This is great: it detects the workspace and everything connects to the server.
When I click 'Submit to Source Control' I get 'Failed Checking Source Control'; when I try adding via P4V by manually marking the new Content folder for add, I get 'file(s) not in client view'.
All we want is the ability to send an Unreal Engine project up to either the WIP depot or the LIVE depot. To resolve this, do we need:
Two different workspaces (one set up for LIVE and one for WIP)?
Do we need to add some local folders to our directory? E:/DELETE/Perforce/UE_WIP & E:/DELETE/Perforce/UE_LIVE?
Do we need to tweak something on the Perforce Server?
Do we need to tweak something in Unreal Engine?
Any and all help would be massively appreciated.
Best,
Ben
https://imgur.com/a/aaMPTvI - Image gallery of issues
Your screenshots don't show how (or if?) you set up your local workspace (i.e. the thing that tells Perforce where the files are on your local workstation).
See: https://www.perforce.com/perforce/r13.1/manuals/p4v/Defining_a_client_view.html
The Perforce server acts as a layer of abstraction between the backend storage (i.e. the depots you've set up) and the client machines where you actually do your work. The location of the depot files doesn't matter at all to the client (any more than, say, the backend filesystem of a web server matters to your web browser); all that matters is how you set up the workspace, which is a simple matter of "here's where my local files are" (the Root) and "here's how my local paths map to depot paths" (the View).
You get the "file not in view" error if you try to add a local file to the depot and it's not in the View you've defined. The fix there is generally to simply fix the Root and/or View to accurately describe where your local files are. One View can easily map to multiple depots (as long as they're on a single server).
(edit)
Specifically, in your case, all of the files you're trying to add are under the path:
E:\DELETE\Perforce\Test\Saved\...
Since you've set up your workspace as:
Client: bsmith
Root: E:\DELETE\Perforce\bsmith
View:
//WIP/... //bsmith/WIP/...
//LIVE/... //bsmith/LIVE/...
then your bsmith workspace consists of these two local paths:
E:\DELETE\Perforce\bsmith\WIP\...
E:\DELETE\Perforce\bsmith\LIVE\...
The files you're trying to add aren't even under your Root, much less under either of the View mappings. That's what the "not in client view" error messages mean.
If you want to add the files where they are, modify your Root and View so that you define your workspace as being where your files are; if you want to have the files in one of the local directories above that you've already defined as being where your workspace lives, you'll have to move them there. If you put your files in bsmith\WIP, then when you add them they'll go to the WIP depot; if you put them in bsmith\LIVE, then they'll go to the LIVE depot, per your View.
Either way, once they're in your workspace, you can add them to the depot. Simple as that!
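For illustration, if you wanted to keep the files where they already are and have them go to the WIP depot, a spec along these lines would do it (same hypothetical client name and paths as above):

Client: bsmith
Root: E:\DELETE\Perforce\Test
View:
//WIP/... //bsmith/...

With that spec everything under E:\DELETE\Perforce\Test maps into //WIP/..., so the Saved folder you were trying to add falls inside the client view and can be marked for add and submitted.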

Physical file location in Pivotal Cloud Foundry

I am working with Spring Boot and have a requirement to interact with a legacy app using a .dll file.
The .dll file requires a configuration file which has to be in a physical location like C:/...
The .dll can only read from a physical location, not from a relative path such as the src folder.
I am able to successfully interact with the legacy app on localhost with the configuration file located in C:/, but when I deploy to PCF, is there any way to read the configuration file from a physical directory location in PCF?
In WAS we can upload a file to the server and use its physical location in the code; can something like this be done in PCF?
You cannot upload files out of band or in advance of running your app. A fresh container is created for you every time you start/stop/restart your application.
As already mentioned, you can bundle any files you require with your application. That's an easy way to make them available. An alternative would be to have your app download the files it needs from somewhere when the app starts up. Yet another option would be to create a buildpack and have it install the file, although this is more work, so I would not suggest it unless you're trying to use the same installed files across many applications.
As far as referencing files, your app has access to the full file system in your container. Your app runs as the vcap user though, so you will have limited access to where you can read/write based on your user permissions. It's totally feasible to read and write to your user's home directory though, /home/vcap. You can also reference files that you uploaded with your app. Your app is located at /home/vcap/app, which is also the $HOME environment variable when your app runs.
Having said all that, the biggest challenge is going to be that you're trying to bundle and use a .dll, which is a Windows shared library, with a Java app. On Cloud Foundry, Java apps only run on Linux cells. That means you won't really be able to run your shared library unless you can recompile it as a Linux shared library.
Hope that helps!
If the file is packaged inside your Spring Boot jar, you can access it under BOOT-INF/classes/...
Just unzip your jar, look for that config file, and use that path.
Once the jar is exploded in PCF, this hierarchy is maintained.
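For example (the file name is hypothetical), a legacy-config.xml placed under src/main/resources ends up in the jar under BOOT-INF/classes, so once the app is running on PCF it should be readable at roughly

/home/vcap/app/BOOT-INF/classes/legacy-config.xml

i.e. $HOME/BOOT-INF/classes/legacy-config.xml, per the paths described in the earlier answer.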

How to use an SQLite database from one platform (iOS) on another (Windows)

I don't know if this is a valid question or not.
I am working on an MFC/C++ application where I want to use an SQLite database from an iOS application in my Windows application.
My iOS database is encrypted using the sqlite_key command.
When I try to use the same database in my Windows application, it throws an exception for any operation on the database.
While searching on Google I have not been able to find the right track for this.
Can anyone tell me if this is possible? And if yes, please help me with this.
If your plan is to "export" it, i.e. you want to reuse the data inserted by your iOS application in your Windows one, then you simply need to locate the SQLite database file on your iPhone (SQLite stores everything in a single file), copy it to your computer, and tell your Windows software the location of this file.
If your plan is to "share" the database, i.e. both should be able to modify it in "real time", then you will have to roll something of your own, as SQLite3 does not provide any network support; it's just a library to read and write data in a file, in an SQL way.
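One extra note on the "export" option: the question mentions the iOS database is encrypted with sqlite_key, and stock SQLite has no sqlite3_key at all, which by itself would explain the exception on Windows. Below is a minimal sketch, assuming your Windows build links the same encryption extension the iOS app used (e.g. SQLCipher or SEE); OpenEncryptedCopy and the key handling are just illustrative:

#include <cstring>
#include <sqlite3.h>

// Open the database file copied from the iPhone and supply the same key
// the iOS app passed to sqlite3_key. Requires an SQLite build with an
// encryption extension (SQLCipher/SEE); plain sqlite3 does not export sqlite3_key.
sqlite3* OpenEncryptedCopy(const char* path, const char* key) {
  sqlite3* db = nullptr;
  if (sqlite3_open(path, &db) != SQLITE_OK)
    return nullptr;
  if (sqlite3_key(db, key, static_cast<int>(std::strlen(key))) != SQLITE_OK) {
    sqlite3_close(db);
    return nullptr;
  }
  // The key is typically only verified on the first read, so run a simple
  // query (e.g. against sqlite_master) to confirm the database really opens.
  return db;
}

If the Windows side is built against plain, unencrypted SQLite on purpose, the alternative is to export an unencrypted copy of the database on the iOS side first and ship that instead.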