Where is the starting point for custom storage in draw.io?

I want to add a new storage backend such as GitLab/Gitea/Gogs to draw.io,
but it's hard for me to find the starting point with only basic JavaScript dev experience.
I know I should focus on the webapp sources, but I have no idea where to start within them.

Related

Resource window of Google BigQuery

I am new to Google BigQuery.
I am now logged into the BigQuery console, but the resource window for tables and datasets is too small to navigate. It is pinned to the bottom left and is not resizable. How can I unpin the window so I can navigate it?
It looks like this.
Looks like what you want is something like below
There are many ways to accomplish this. I will present two of them.
Customize the existing/available UI to fit your own specific needs using a so-called bookmarklet.
Bookmarklets are saved and used as normal bookmarks. As such, they are simple "one-click" tools which add functionality to the browser.
Bookmarklets have a wide range of uses - one of which is to modify the appearance of a web page within the browser - which is exactly your use case.
You can create a bookmarklet that toggles the visibility of the elements above the data navigator, as in the example above.
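For illustration, a minimal sketch of such a bookmarklet (the selector is an assumption - inspect the page to find the actual elements sitting above the data navigator in your version of the console):

    javascript:(function () {
      /* Hypothetical selector - replace with the real container(s)
         above the data navigator in the BigQuery console. */
      var els = document.querySelectorAll('.top-panel');
      for (var i = 0; i < els.length; i++) {
        var el = els[i];
        /* Toggle visibility: one click hides, the next restores. */
        el.style.display = (el.style.display === 'none') ? '' : 'none';
      }
    })();

Collapse it to a single line and save it as the URL of a bookmark; clicking the bookmark then toggles the panels.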
If your needs are more sophisticated and you are looking for more than just hiding UI elements, you can look into a third-party IDE for BigQuery.
One I can recommend checking out is Goliath - part of the Potens.io Suite for BigQuery. There you will find everything you would expect from a professional big data IDE.
It is free to use and is available on the GCP Marketplace.
Disclosure - I am part and lead of Potens.io Team (which is also clearly stated in my SO Profile)
Did you mean you want to unpin a project that has the pin icon in the screenshot?
If yes, you can unpin a project by clicking "UNPIN PROJECT" on the right side of the BigQuery console.
You have to click the project (e.g. bigquery-public-data) first for "UNPIN PROJECT" to appear.

Image processing on a web server

I want to run image processing algorithms on a server that can interact easily with web apps. The image processing algorithms are compute-heavy and won't be available in pre-built libraries. Currently I am using Ruby on Rails on Heroku for my website.
What would be the best architecture to achieve this: take images from the website, run the image processing algorithm on them, and display the results back on the website?
Most of my image processing code is in C/C++.
Can I call C/C++ code from Ruby on Rails directly? Is this possible on Heroku?
Or should I design a system where the C/C++ code exposes some APIs which can be called by the Ruby on Rails server?
Heroku typically uses small virtual machine instances, so depending on just how heavy your processing is, it may not be the best choice of architecture. However, if you do use it I would do this:
Use a background task gem to do your processing, and have it running in a separate process (a worker dyno rather than a web dyno, in Heroku terminology). Delayed Job is a tried and tested solution for background tasks, with a wealth of online information about integrating it into Heroku, but there are newer gems like Sidekiq which use the threading support in modern versions of Ruby. Those would allow everything to be done inside the web dyno, but I would say it is useful to keep all background processing away from the webserver dynos, so Delayed Job (or similar) would be fine.
As for integrating C/C++, I haven't needed to do this yet. However, I know it is possible to create gems that integrate C or C++ code and compile natively. As long as you're using Ruby rather than JRuby, I don't think Heroku should have a problem with them. There are other ways of accomplishing this; look at SO questions specifically about this topic, such as
How can I call C++ functions from within ruby
It seems that you need to create an extension, then create a gem to contain it. These links may or may not help.
http://www.rubyinside.com/how-to-create-a-ruby-extension-in-c-in-under-5-minutes-100.html
http://guides.rubygems.org/gems-with-extensions/
I recommend making a gem, as I think it may be difficult to otherwise get libraries or executables onto a Heroku instance. You can store the gem in your vendor directory if you don't want to make it public.
Overall, I would have the webserver upload to S3 or wherever you're storing the images (this can be done directly in the browser, without using the webserver as a stepping stone, via the AWS JS API; have a look for gems to help).
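As a sketch of that browser-side upload (the bucket name, key scheme and follow-up endpoint are assumptions, and in a real app you would scope the credentials via a signed policy or federated identity rather than embedding keys in the page):

    // Browser-side upload straight to S3 with the AWS JS SDK,
    // so the image never passes through the Rails dyno.
    AWS.config.update({ region: 'us-east-1' });
    var s3 = new AWS.S3({ params: { Bucket: 'my-upload-bucket' } }); // hypothetical bucket

    function uploadImage(file) {
      s3.putObject({ Key: 'uploads/' + file.name, Body: file }, function (err) {
        if (err) { console.log('Upload failed: ' + err); return; }
        // Tell the Rails app the file is in place so it can enqueue
        // the background processing job ('/images' is hypothetical).
        $.post('/images', { key: 'uploads/' + file.name });
      });
    }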
Then the webserver can request a background task to process the image.
If you're not storing them, things become a little more interesting... You'll need a database if you're using background tasks, so you could perhaps pass the image data over to the worker as a blob in the database.
I really wouldn't do all the processing just in the webserver dyno, unless you're really only hitting this thing very occasionally. With multiple users you'd hit a bottleneck very quickly.
The background process can set a flag on the image's table row so the webserver can let the user know when processing is complete. (You can poll for that information with JS on the upload-complete screen using AJAX.)
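A minimal sketch of that polling step (the status endpoint and JSON shape are assumptions):

    // Ask the server every few seconds whether the worker has set the
    // 'processed' flag on the image row. '/images/<id>/status' is a
    // hypothetical endpoint returning JSON like {"processed": true}.
    function pollStatus(imageId) {
      $.getJSON('/images/' + imageId + '/status', function (data) {
        if (data.processed) {
          window.location = '/images/' + imageId; // or update the page in place
        } else {
          setTimeout(function () { pollStatus(imageId); }, 3000);
        }
      });
    }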
Of course, there are many other ways of accomplishing this, depending on a number of factors.
Apologies that the answer is vague, but the question is quite open-ended.
Good luck.

How do those who are not using a backend framework (such as Rails/Symfony/Django) go about developing and deploying an Ember application's assets?

More specifically, when using a backend application framework I generally am afforded some level of asset management which allows me to work with multiple files in development which are uncompressed and unminified and then in production mode those files become automatically minified, compressed, and concatenated into a single file.
I am looking to create an Ember application that is a single page app that interfaces with a separate RESTful services layer. I simply do not need the weight of a framework behind the Ember app and am hoping to serve it as static html+css+js, so I am looking for any guidance on how to easily manage development and deployment of a client-side only app without adding much overhead.
Right now my biggest issue is with including JS (and to a lesser extent, CSS) files. My HTML is static and I have an Ember app composed of many files, so I have many script tags to include them all. This is clearly not appropriate for production, so I imagine some kind of build tool will be needed to assemble my JavaScript files and overwrite the script tags in the HTML file. Are there tools out there right now that will do this? Is there another approach that I may be overlooking?
This is my first fully client-side application so it's very possible that I just need to make a paradigm shift, having done server-side applications for so long.
Agreed this can be tricky without a backend framework. For sure script tags are not the way to go and you will need some kind of build tool for production deployment.
Ember App Kit is a solution a few of us have been working on. It's still early stages, but I've used it for a couple of projects so far and it's been much better than trying to roll my own with Grunt. I would expect it to become the default starting point for Ember apps in the near future. To try it now, just download it as a zip, then read the Getting Started Guide.
There are many other solid solutions out there; consider checking out:
ember-tools
brunch-with-ember-reloaded
brunch-with-hamsters
charcoal
I use a combination of RequireJS and Grunt, using the grunt-contrib tasks along with a task that can compile your ember-handlebars templates into functions. (grunt-contrib includes the ability to watch for changes in your files and perform various build steps, which may differ between development and production.) You can have separate Grunt tasks which run for production or development. Of course, for all of this you are going to need Node!
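A minimal sketch of such a Gruntfile (the paths are assumptions; the tasks come from the grunt-contrib-concat, grunt-contrib-uglify and grunt-contrib-watch plugins):

    // Gruntfile.js - concatenate in development, minify for production.
    module.exports = function (grunt) {
      grunt.initConfig({
        concat: {
          dist: { src: ['app/**/*.js'], dest: 'dist/app.js' }
        },
        uglify: {
          dist: { files: { 'dist/app.min.js': ['dist/app.js'] } }
        },
        watch: {
          scripts: {
            files: ['app/**/*.js'],
            tasks: ['concat'] // fast rebuild while developing
          }
        }
      });

      grunt.loadNpmTasks('grunt-contrib-concat');
      grunt.loadNpmTasks('grunt-contrib-uglify');
      grunt.loadNpmTasks('grunt-contrib-watch');

      grunt.registerTask('default', ['concat']);              // development
      grunt.registerTask('production', ['concat', 'uglify']); // deploy build
    };

With the minified bundle in hand, the static HTML only needs a single script tag pointing at dist/app.min.js.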

About Sitecore Backup

I am trying to backup a whole Sitecore website.
I know that the package designer can do part of the job, but not entirely.
Having a backup is always good for when the site breaks accidentally.
Is there a way or a tool to backup the whole Sitecore website?
I am new to Sitecore, so any advice is welcome.
Thank you!
We've got a SQL job running to back up the databases nightly.
Apart from that, when I deploy a small bit of code I usually end up backing up only the parts I'm going to replace. If it's a big code deploy, I just back up the whole website (code-wise anyway) before deploying the code package.
Apart from that, we also run scheduled backups of the code (although I don't know the intervals), and of course we've got source control if everything else fails.
If you've got an automated deployment tool you could also automate the above of course.
Before a major deploy of content or code, I typically back up the master database and zip everything in the website directory minus the App_Data and temp directories. That way, if the deploy goes wrong, I can restore the code and database fairly quickly and be back to the previous state.
I have no knowledge of a tool that can do this for you, but there are a few ways you can handle this easily:
1) You can create a database backup of the master database, but this only contains content - not files such as media files saved on disk, nor your complete built solution. It is always a good idea to schedule your database backup every night and keep the backups for at least a week or more.
2) When you use the package designer, you can create dynamic packages that contain all your content, media files and solution files on disk. This is an easy way to deploy the site onto a new Sitecore installation all at once, but it requires a manual backup every time.
3) Another way is to serialize your entire content tree to an XML format on disk from the Developer tab. Once serialized, you can revert it back into the content tree.
I'd suggest thinking of this in two parts. The first part is backing up the application, which is as simple as making sure your application is in some SCM system.
For that you can use Team Development for Sitecore. One of its features allows you to connect a Visual Studio project to your Sitecore instance.
You can select Sitecore items that you want to be stored in your solution and it will serialize them and place them into your solution.
You can then check them into your SCM system and sleep easier.
The thing to note is deciding which items to place in source control. Generally you can think of Sitecore items as developer-owned or Content Editor-owned. The items you will place in your solution are the developer-owned ones; templates, sublayouts, layouts, and content items that you need for the site to function are good examples.
This way if something goes bad a base restoration is quick and easy.
The second part is the backup of the content in Sitecore that has been added since your deployment. For that, like Trayek said above, use a SQL job to do the backups at whatever interval you are comfortable with.
If you're bored I have a post on using TDS (Team Development for Sitecore) you can check out at Working with Sitecore, Part Nine: TDS
Expanding a bit more on what Trayek said, my suggestion would be to set up Continuous Integration (CI) with automated deploys using TeamCity.
A good answer is also given here on Stack Overflow.
Basically, in your case TeamCity would automatically:
1. Take a backup of the current website (i.e. the code) and deploy the new code on top of it.
2. Scripts can also be written to take a differential backup of the SQL databases, if need be.
Hope this helps.
Take a look at the Sitecore Instance Manager module. It works really well for packaging an entire Sitecore instance.

Offline use of the Google Earth plugin

I have a use case that requires offline access to Google Earth. I know that Google Earth Enterprise offers a disconnected product; however, we may not have access to that product, and/or Google Earth Enterprise is prohibitively expensive at $25K for a dev license.
I would prefer to use the Google Earth plugin since I am building an application and would like to use the JS API. Is it possible to host the Google Earth plugin on my own disconnected server? We would use Google Earth connected to a standalone offline WMS server for access to imagery.
Said another way: can I host the plugin and corresponding JavaScript on my own server?
I do not know if I understand your problem well, but I can explain what I'm currently working on.
In my current application with the Google Earth plugin JS API, I'm able to start the plugin even when offline. But one requirement is to have cached data.
If you have cached data and you start the plugin offline, then zooming to a level with higher resolution than the one you have in your cached data will have no effect (the imagery will not be updated to the higher resolution).
But depending on what you really need: yes, you can start the plugin offline.
This is not really answering your original question, but if you are interested, just tell me :-)
I tried to cache Google Earth with a proxy server, but I couldn't.
Furthermore, I think the API is validated against Google's servers every time it loads and doesn't allow offline use.
It's been some months now since I worked with this.
I'll try to explain what I can remember :-)
In the HTML where I have my plugin, I removed:

    <script type="text/javascript" src="https://www.google.com/jsapi"></script>

but I saved this jsapi.js file locally. I also saved loader_1-008.js locally.
Then, in my code (C++, Qt), I use evaluateJavaScript(QString source) twice, where source is the text read from my two .js files.
These two evaluateJavaScript calls need to be done before loading my HTML (the one with the plugin) in my QWebView.
I cannot remember much more, but I hope this can get you started.
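For reference, once those two scripts are available locally (or injected via evaluateJavaScript), the JS side is the standard Earth API bootstrap; only the script sources change. A minimal sketch ('map3d' as the container id is an assumption):

    // Standard Google Earth plugin startup sequence. Run offline, it
    // will only succeed if the loader scripts and plugin data are
    // available locally/cached, as described above.
    var ge = null;

    google.load('earth', '1');

    function init() {
      // 'map3d' is the id of the <div> hosting the plugin.
      google.earth.createInstance('map3d', initCB, failureCB);
    }

    function initCB(instance) {
      ge = instance;
      ge.getWindow().setVisibility(true); // show the globe
    }

    function failureCB(errorCode) {
      // Offline validation failures end up here.
      console.log('Earth plugin failed to start: ' + errorCode);
    }

    google.setOnLoadCallback(init);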