I am wondering if it is possible to install Adobe After Effects as an 'application' in Azure Batch. If so, does anyone know how to do it?
I've not tried it yet. I did do some research around this though and it looks possible.
It appears we need:
A VM image (like these prebuilt images)
Containing aerender.exe
With a blank file called ae_render_only_node.txt
in the Public Documents folder, C:\Users\Public\Documents\Adobe (to enable non-royalty-bearing mode)
It looks like a user did it here, though not specifically in Azure Batch.
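If it helps, here is a minimal sketch of what a render step on such a node might look like, assuming After Effects is already baked into the VM image; the aerender path, project, comp name and output pattern below are placeholders, not values from any documentation:

```python
import os
import subprocess

# Placeholder paths -- adjust to wherever AE lives in your image
AERENDER = r"C:\Program Files\Adobe\Adobe After Effects 2023\Support Files\aerender.exe"
MARKER_DIR = r"C:\Users\Public\Documents\Adobe"
MARKER = os.path.join(MARKER_DIR, "ae_render_only_node.txt")

# Blank file that switches aerender into non-royalty-bearing (render-only) mode
os.makedirs(MARKER_DIR, exist_ok=True)
open(MARKER, "a").close()

# Render one comp from a project; project, comp and output are placeholders
subprocess.run([
    AERENDER,
    "-project", r"D:\batch\wd\project.aep",
    "-comp", "Main Comp",
    "-output", r"D:\batch\wd\out\frame_[####].png",
], check=True)
```

Something like this could be run as the command line of a Batch task once the pool nodes are built from the image above.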
I've looked for this across the web a few times, and I feel like this hasn't been asked exactly, or I may just be getting bogged down with the wrong search terms. Hoping to get an easy answer here ("no, you can't get this" is an acceptable answer).
The variations from the base CentOS image are listed here: Link to GCP
However, they don't actually provide a download for this image. I'm trying to get a local VM running in VMWare with this image.
I feel as though they'd provide this to their clients to make it easier to prepare for use of their product, but I'm not finding it anywhere.
If anyone could toss me a link to a pre-configured CentOS ISO with the minor changes, I'd definitely take that as an alternative. I'm just not confident enough in my Linux skills to configure the firewall properly :)
GCP doesn't support exporting Google-provided images. However, it does support exporting custom images.
I don't have any experience with image exporting, but I think this works.
Create custom images
You can create custom images based on your GCE VM instance.
Go to Navigation menu -> Compute Engine -> Images page.
You can create a custom image from a disk or a snapshot on this page.
Select one and create a custom image.
Export your image
After creating the custom image successfully, go to the custom image's page and click "Export" at the top.
Select the export format and the GCS destination, then click Export.
Now you have an image file in Google Cloud Storage.
Download the image file and import it into your local VM software.
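If you'd rather script those console steps, a rough sketch driving the gcloud/gsutil command-line tools from Python might look like this; it assumes gcloud is installed and authenticated, and the image, disk, zone and bucket names are placeholders:

```python
import subprocess

def run(cmd):
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1) Create a custom image from an existing VM disk (placeholder names)
run(["gcloud", "compute", "images", "create", "my-centos-image",
     "--source-disk", "my-vm-disk", "--source-disk-zone", "us-central1-a"])

# 2) Export the custom image to Cloud Storage in VMDK format for VMware
run(["gcloud", "compute", "images", "export",
     "--image", "my-centos-image",
     "--destination-uri", "gs://my-bucket/my-centos-image.vmdk",
     "--export-format", "vmdk"])

# 3) Download the exported file to the local machine
run(["gsutil", "cp", "gs://my-bucket/my-centos-image.vmdk", "."])
```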
I have a Foswiki wiki on a server. Is it possible to script the following without FTP access (for various reasons I can't use it):
Download a topic's wikitext, modify it locally, then upload it again (overwriting the topic)
Upload wikitext to a new topic
I've been doing these tasks manually, but I'd like to automate them. I've looked into the Foswiki API and a few plugins, but nothing seems capable of doing this.
Is there a way? (any programming language)
If you have web access, you could drive the bin/view and bin/save scripts remotely from a script.
Take a look at our BuildContrib upload target for an example. It gets a strikeone key and downloads the original topic to recover any form data. It then uploads the topic text, creating a new version. It's written in Perl and uses LWP.
https://github.com/foswiki/distro/blob/master/BuildContrib/lib/Foswiki/Contrib/BuildContrib/Targets/upload.pm
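As a rough illustration of the same idea in Python with requests (rather than Perl/LWP), assuming the wiki accepts basic auth (ApacheLogin) and that validation is not set to strikeone; with strikeone you would also have to fetch and resubmit the validation_key, as the BuildContrib target does. The host, web, topic and credentials are placeholders:

```python
import requests

BASE = "https://wiki.example.org/bin"   # placeholder wiki URL
AUTH = ("WikiUser", "secret")           # placeholder credentials

session = requests.Session()
session.auth = AUTH

# Download the raw wikitext of an existing topic via bin/view
resp = session.get(f"{BASE}/view/Sandbox/MyTopic", params={"raw": "text"})
resp.raise_for_status()
text = resp.text

# ... modify the wikitext locally ...
text += "\n---++ Section added by script\n"

# Upload it again via bin/save, overwriting the topic (creates a new revision).
# Saving to a topic that does not exist yet creates it.
resp = session.post(f"{BASE}/save/Sandbox/MyTopic",
                    data={"text": text, "forcenewrevision": "1"})
resp.raise_for_status()
```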
The following isn't(!) the "right" solution (there surely exists a nicer Foswiki-way approach), but if you know Perl, you can do almost anything with the following:
Install Firefox
Install the MozRepl add-on into it
Install the WWW::Mechanize::Firefox Perl module
Now you can script anything that you can do directly from the browser, e.g. logging into Foswiki, clicking buttons, saving topics, etc. Drawback: it isn't an easy way; you need to know many details.
I use this technique myself for testing.
The Sitecore documentation provides some pretty clear instructions on how to configure a Sitecore instance as a processing server:
https://doc.sitecore.net/sitecore_experience_platform/xdb_configuration/configure_a_processing_server
However, many of those steps require enabling/disabling of files manually on the installed server. Has anybody seen or built a patch file (similar to SwitchMasterToWeb) that can disable/enable the appropriate functionality as a patch? I would rather not touch the default Sitecore install and instead rely on automated deployment of configuration patches.
I haven't seen this as a patch and I'm not sure if it's possible to do this with just one patch (I would love to be proved wrong), but for something like this I've used a PowerShell script.
I set up Octopus Deploy to run a PowerShell script step after deploy to disable files and change settings if patch files can't do the job.
I can highly recommend the PowerCore tools for this kind of thing.
https://github.com/adoprog/Sitecore-PowerCore/tree/master/Framework/ConfigUtils
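Whichever tool runs it, the core of such a script is mostly renaming config files between .config and .config.disabled. A minimal sketch of that logic (shown in Python here purely for illustration; the file names below are made up, take the real lists from the Sitecore processing-server documentation):

```python
import os

WEBROOT = r"C:\inetpub\wwwroot\Sitecore\Website"   # placeholder path
INCLUDE = os.path.join(WEBROOT, "App_Config", "Include")

# Hypothetical file names -- use the actual lists from the Sitecore docs
DISABLE = ["Sitecore.SomeDeliveryFeature.config"]
ENABLE = ["Sitecore.SomeProcessingFeature.config.disabled"]

for name in DISABLE:
    path = os.path.join(INCLUDE, name)
    if os.path.isfile(path):
        os.rename(path, path + ".disabled")

for name in ENABLE:
    path = os.path.join(INCLUDE, name)
    if os.path.isfile(path) and path.endswith(".disabled"):
        os.rename(path, path[: -len(".disabled")])
```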
If anybody else winds up looking for this, I've posted some work up on GitHub with patch files for a variety of 8.0 versions:
https://github.com/jst-cyr/Sitecore-Role-Configs
The patches there will do the 'disable/enable/change' for authoring, delivery, or processing. I don't have one for the reporting server.
Sitecore has evaluated a POC for the same. At this point in time it is applicable to Sitecore CMS 8.1 rev. 160302 (Update-2). See here:
https://github.com/Sitecore/Sitecore-Configuration-Roles
I am trying to backup a whole Sitecore website.
I know that the package designer can do part of the job, but not entirely.
Having a backup is always good in case the site is accidentally broken.
Is there a way or a tool to backup the whole Sitecore website?
I am new to Sitecore, so any advice is welcome.
Thank you!
We've got a SQL job running to back-up the databases nightly.
Apart from that, when I deploy code and it's a small change I usually end up backing up only the parts I'm going to replace. If it's a big code deploy I just back up the whole website (code-wise, anyway) before deploying the code package.
Apart from that we also run scheduled backups of the code (although I don't know the intervals), and of course we've got source control if everything else fails.
If you've got an automated deployment tool you could also automate the above of course.
Before a major deploy of content or code, I typically backup the master database and zip everything in the website directory minus the App_Data and temp directories. That way if the deploy goes wrong, I can restore the code and database fairly quickly and be back to the previous state.
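A small sketch of that zip step (the webroot and backup paths are placeholders), skipping App_Data and temp directly under the website directory:

```python
import datetime
import os
import zipfile

WEBROOT = r"C:\inetpub\wwwroot\MySite\Website"   # placeholder path
BACKUP_DIR = r"C:\Backups"                       # placeholder path
EXCLUDE = {"app_data", "temp"}                   # top-level folders to skip

os.makedirs(BACKUP_DIR, exist_ok=True)
stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M")
archive = os.path.join(BACKUP_DIR, f"website-{stamp}.zip")

with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for root, dirs, files in os.walk(WEBROOT):
        # Skip App_Data and temp where they sit directly under the webroot
        dirs[:] = [d for d in dirs
                   if not (root == WEBROOT and d.lower() in EXCLUDE)]
        for name in files:
            full = os.path.join(root, name)
            zf.write(full, os.path.relpath(full, WEBROOT))
```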
I have no knowledge of a tool that can do this for you, but there are a few ways you can handle this easily:
1) You can create a database backup of the master database, but this only contains content, not files such as media files saved on disk or your complete built solution. It is always a good idea to schedule your database backup every night and keep the backups for at least a week or more (see the sketch after this list).
2) With the package designer you can create dynamic packages that can contain all your content, media files and solution files on disk. This is an easy way to deploy the site onto a new Sitecore installation all at once, but it requires a manual backup every time.
3) Another way is to serialize your entire content tree to an XML format on disk from the Developer tab. Once serialized, you can revert the items back into the content tree.
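For option 1), a minimal sketch of triggering such a backup from a script, assuming pyodbc and sufficient rights on the SQL Server; the server, credentials, database name and backup path are all placeholders:

```python
import pyodbc

# Placeholder connection details for the SQL Server hosting the Sitecore databases
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver;UID=backup_user;PWD=secret",
    autocommit=True,  # BACKUP DATABASE cannot run inside a transaction
)

# Back up the Sitecore master (content) database, overwriting any previous file
conn.execute(
    "BACKUP DATABASE [Sitecore_Master] "
    "TO DISK = N'D:\\Backups\\Sitecore_Master.bak' WITH INIT"
)
conn.close()
```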
I'd suggest thinking of this in two parts. The first part is backing up the application, which is as simple as making sure your application is in some SCM system.
For that you can use Team Development for Sitecore. One of its features allows you to connect a Visual Studio project to your Sitecore instance.
You can select Sitecore items that you want to be stored in your solution and it will serialize them and place them into your solution.
You can then check them into your SCM system and sleep easier.
The thing to note is deciding which items to place in source control. Generally you can think of Sitecore items as developer-owned or Content-Editor-owned. The items you will place in your solution are the ones that are developer-owned; templates, sublayouts, layouts, and content items that you need for the site to function are good examples.
This way if something goes bad a base restoration is quick and easy.
The second part is the backup of the content in Sitecore that has been added since your deployment. For that, like Trayek said above, use a SQL job to do the backups at whatever interval you are comfortable with.
If you're bored I have a post on using TDS (Team Development for Sitecore) you can check out at Working with Sitecore, Part Nine: TDS
Expanding a bit more on what Trayek said, my suggestion would be to set up Continuous Integration (CI) with automated deploys using TeamCity.
A good answer is also given here on Stack Overflow.
Basically, in your case TeamCity would automatically:
1. Take a backup of the current website (i.e. the code) and deploy the new code on top of it.
2. Scripts can also be written to take a differential backup of the SQL databases, if need be.
Hope this helps.
Take a look at the Sitecore Instance Manager module. It works really well for packaging an entire Sitecore instance.
I have a use case that requires offline access to Google Earth. I know that Google Earth Enterprise offers a disconnected product; however, we may not have access to that product and/or Google Earth Enterprise is prohibitively expensive at $25K for a dev license.
I would prefer to use the Google Earth plugin since I am building an application and would like to use the JS API. Is it possible to host the Google Earth plugin on my own disconnected server? We would use Google Earth connected to a standalone offline WMS server for access to imagery.
Said another way, can I host the plugin and corresponding JavaScript on my own server?
I do not know if I understand your problem well, but I can explain what I'm currently working on.
In my current application with the Google Earth plugin JS API, I'm able to start the plugin even when offline. But one requirement is to have cached data.
If you have cached data and you start the plugin offline, then zooming to a level with higher resolution than the one you have in your cached data will have no effect (imagery will not be updated to a higher resolution).
But depending on what you really need: yes, you can start the plugin offline.
This is not really answering your original question but if you are interested, just tell me :-)
I tried to cache Google Earth with a proxy server but I couldn't.
Furthermore, I think the API is validated against Google's servers every time it loads and doesn't allow offline use.
It's been some months now since I worked with this.
I'll try to explain with what I can remember :-)
In the HTML where I have my plugin, I have removed:
<script type="text/javascript" src="https://www.google.com/jsapi">
but I have saved this jsapi.js file locally. I also saved loader_1-008.js locally.
Then, in my code (C++, Qt) I'm using evaluateJavaScript(QString source) twice,
where source is the text read from my 2 .js files.
These 2 evaluateJavaScript calls need to be made before loading my HTML (the one with the plugin)
in my QWebView.
I cannot remember much more, but I hope this can start to help you.
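If it helps, here is a rough reconstruction of that sequence in Python/PyQt rather than C++/Qt, assuming a PyQt build that still ships the QtWebKit-based QWebView the answer refers to; the file paths are placeholders:

```python
import sys
from PyQt5.QtCore import QUrl
from PyQt5.QtWebKitWidgets import QWebView   # assumes a PyQt build that still ships QtWebKit
from PyQt5.QtWidgets import QApplication

app = QApplication(sys.argv)
view = QWebView()
frame = view.page().mainFrame()

# Evaluate the two locally saved Google scripts before loading the page,
# mirroring the two evaluateJavaScript() calls described above
for js_file in ["jsapi.js", "loader_1-008.js"]:          # local copies of Google's files
    with open(js_file, encoding="utf-8") as fh:
        frame.evaluateJavaScript(fh.read())

# Now load the HTML that hosts the Earth plugin (placeholder path)
view.load(QUrl.fromLocalFile("/path/to/earth_plugin.html"))
view.show()
sys.exit(app.exec_())
```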