How do I get Google Cloud Source file view to offer the EDIT button?

The GC docs say the source file view offers an EDIT button, and their screenshots show it, but I get no EDIT button.
How do I get the EDIT button?

Thanks, ChrisJJ. As we head towards GA for Cloud Source Repositories, we're trimming out underused and half-baked features, of which this is both. It's particularly half-baked because you can't use it to create new files or folders, move files or folders around, delete files, keep files in sync with the cloud shell, etc.
So, we've pulled this feature (and are updating the docs appropriately). However, if you'd like to edit your files on the web, you can do so with the Cloud Shell directly (via nano, vi or emacs) or you can use the new code editor feature described here: https://cloudplatform.googleblog.com/2016/10/introducing-Google-Cloud-Shels-new-code-editor.html
I think you'll find that this is a MUCH more full-featured editor experience and we're continuing to look at ways to make it even better.

Related

`gsutil equivalent` missing in GCP User Interface

I am doing a tutorial for the Google Certified Associate Cloud Engineer 2020 exam, which used to be on Udemy and is now on Cloud Guru. I am watching a video on GCS: Google Cloud Storage.
At one point the tutor, while using the GCP user interface, renames a file. In the Rename Object window, a great feature shows the gsutil equivalent.
This gsutil equivalent does not show up in my GCP user interface. Is there an option to turn it on, or is this a feature that no longer exists?
I have tried looking through the options in the user interface, but I cannot find the one I am looking for. I have tried to Google this, but most of what comes up relates to gsutil itself rather than the user interface.
Regarding whether you have to activate something to get this feature: you don't, and there is no way to activate it. This part of the GCP UI has simply changed since the video you used as a reference was released.
If you want the same gsutil command, you still can get it: click the Move option instead of Rename. This opens another window where you will find the same gsutil command as in the image you shared.
The same command is present for Move as it was for Rename because a rename is, in the end, a move, which is in fact a two-step process: a copy and a delete, as can be seen in the steps to rename an object using the REST API described in the docs.
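To illustrate those two steps, here is a minimal sketch using the google-cloud-storage Python client (the bucket and object names are placeholders); the single-command equivalent is gsutil mv:

```python
# A rename is a copy followed by a delete; names below are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")
source = bucket.blob("old-name.txt")

bucket.copy_blob(source, bucket, "new-name.txt")  # step 1: copy
source.delete()                                   # step 2: delete
# CLI equivalent: gsutil mv gs://my-bucket/old-name.txt gs://my-bucket/new-name.txt
```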
If you want this feature to be available again in the GCP UI, you can always open a feature request in the Issue Tracker asking for it.
The rename feature is also available in the GCP Console. Check the following screenshot:
https://i.stack.imgur.com/b8pyW.png

How do I move my Firebase directory in my Google Cloud Shell/Instance?

Some time ago I set up Firebase Functions & Hosting (I've never used Hosting, but I use Functions extensively) to work with the Android app I'm building. When I set this up, I didn't know much about the environment and didn't realize "where" I was creating the directories. I ended up creating the directory under /home/myusername/firebase. While I still don't know much more (lol) about the environment, I came to realize that other users in the project are unable to see these directories, because they are in my home/user directory (right?). I now have another user in the project who needs to be able to access these directories.
I know that I can probably "easily" just move these directories with the mv command, but is that the correct/proper way? I assume it is not.
How do I go about safely moving these directories to a higher-level directory, or putting them in a place where other users can access them as well?
Thank you for any help and guidance.
It sounds like you are overthinking things. Just move files around with mv like you would normally. Cloud Shell doesn't really operate much differently than your desktop with respect to moving files and directories around.
If you want to share your source code, you should use some form of source control, such as GitHub or Google Cloud Source Repositories. Cloud Shell is not a good place to share data.
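If you go the source-control route, here is a hedged sketch of publishing the directory to Cloud Source Repositories from inside ~/firebase (the repository name and project ID are placeholders; Cloud Shell normally has gcloud and git credentials already set up):

```python
# Sketch: create a Cloud Source Repository and push the current directory.
# Repository name and project ID are placeholders.
import subprocess

def run(*cmd: str) -> None:
    subprocess.run(cmd, check=True)

run("gcloud", "source", "repos", "create", "firebase-functions")
run("git", "init")
run("git", "add", "-A")
run("git", "commit", "-m", "Import Firebase functions")
run("git", "remote", "add", "google",
    "https://source.developers.google.com/p/MY_PROJECT/r/firebase-functions")
run("git", "push", "google", "master")
```

Other project members can then clone the repository into their own home directories instead of reading yours.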

Can I add a Google Trends graph to Google Data Studio?

I would like to add a Google Trends chart for a specific search term to my Google Data Studio report, but Trends is not an option in the Data Source list. I wasn't able to find an option to embed JavaScript either. Is it possible to add a Trends chart to a report in Data Studio? Thanks!
I am posting this workaround as it seems no similar solution has been provided since.
You can actually do this, using a small workaround:
Create the graph you want to embed using Google Trends.
Click the "embed" icon in the upper right corner of the graph, and copy the JS-code (for either desktop or mobile device)
Create a simple empty HTML file using Notepad or a similar text editor (including <html>, <head>, and <body> tags, as per common standard; a minimal sketch of this file follows after these steps). Place it in an empty folder on your hard drive.
Paste the Google Trends embed code into the <body> section of your HTML file.
Go to https://app.netlify.com/drop and upload the whole folder (including your .html-file). Copy the direct link provided by Netlify. (note: Any other form of public hosting should work fine, this is just my personal preference)
In Google Data Studio, click "URL embed" and paste your direct link.
Voila!
(Note: As this is a direct graph link and not a data feed, it unfortunately won't let you filter or change settings, but if configured wisely before copying the embed code, it should do the trick for any time range, year-on-year comparison, or similar needs.)
Hope this helps someone :)
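For reference, here is a minimal sketch of the wrapper file from steps 3 and 4, written as a small Python script so the structure is explicit (the embed snippet is a placeholder; paste the JS code you copied from Google Trends in its place):

```python
# Writes the minimal wrapper page; EMBED_CODE is a placeholder for the
# <script> snippet copied from Google Trends.
EMBED_CODE = "<!-- paste the Google Trends embed snippet here -->"

html = f"""<!DOCTYPE html>
<html>
  <head><meta charset="utf-8"><title>Trends chart</title></head>
  <body>
    {EMBED_CODE}
  </body>
</html>
"""

with open("index.html", "w", encoding="utf-8") as f:
    f.write(html)
```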
You can use supermetrics.com, which has a free Google Trends data source, and then import a common sheet into your dashboard. The only problem is that you won't be able to change the date range, meaning it's only "one way".
Unfortunately, the Google Trends data connector has stopped working in Supermetrics. They use an unofficial Google API that has been faulty lately.
The connector was removed in December 2018.

Foswiki: Uploading and downloading topics without FTP

I have a Foswiki wiki on a server. Is it possible to script the following without FTP access (for various reasons I can't use it):
Download a topic's wikitext, modify it locally, then upload it again (overwriting the topic)
Upload wikitext to a new topic
I've been doing these tasks manually, but I'd like to automate them. I've looked into the Foswiki API and a few plugins, but nothing seems capable of doing this.
Is there a way? (any programming language)
If you have web access, you could drive the bin/view and bin/save scripts remotely from a script.
Take a look at our BuildContrib upload target for an example. It gets a strikeone key and downloads the original topic to recover any form data. It then uploads the topic text, creating a new version. It's written in perl, and uses LWP.
https://github.com/foswiki/distro/blob/master/BuildContrib/lib/Foswiki/Contrib/BuildContrib/Targets/upload.pm
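As an illustration of driving those scripts remotely, here is a hedged sketch in Python with requests rather than Perl/LWP (the base URL, web and topic names, and credentials are assumptions; a site with strikeone validation enabled additionally needs the validation-key handling that the BuildContrib target shows):

```python
# Download a topic's wikitext via bin/view, edit it, re-upload via bin/save.
# URL, names and credentials below are assumptions for this sketch.
import requests

BASE = "https://example.org/foswiki/bin"
WEB, TOPIC = "Sandbox", "MyTopic"

session = requests.Session()
session.auth = ("WikiUser", "password")  # assumes basic auth is accepted

# raw=text returns the plain wikitext of the topic.
r = session.get(f"{BASE}/view/{WEB}/{TOPIC}", params={"raw": "text"})
r.raise_for_status()
wikitext = r.text

# Modify locally, then save, which creates a new topic revision.
wikitext += "\n\n-- edited remotely\n"
r = session.post(f"{BASE}/save/{WEB}/{TOPIC}", data={"text": wikitext})
r.raise_for_status()
```

Uploading to a new topic works the same way; saving to a topic name that doesn't exist yet creates it.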
The following isn't(!) the right solution (surely a nicer Foswiki-way approach exists), but if you know Perl, you can do almost anything with it:
Install Firefox
Install the MozRepl add-on into it
Install the WWW::Mechanize::Firefox Perl module
Now you can script anything you could do directly from the browser, e.g. logging into Foswiki, clicking buttons, saving topics, and so on. The drawback: it isn't an easy way; you need to know many details.
I use this technique myself for testing.

Virus scan for files being uploaded to Sitecore

Are there any best practices on virus scanning all files being uploaded to the Sitecore media library (and ultimately stored in Sitecore's DB)?
I searched all over the web, but there is too much noise caused by the word "virus", since many people seem to have performance issues on servers that have anti-virus software installed.
I don't know if it is an established best practice, but I would probably add a processor for the uiUpload pipeline that uses an API or command-line process from a commercial antivirus product. Other than the fact that it is in a pipeline processor, it shouldn't really be much different from how you would do it in any other ASP.NET application. Performance will definitely be a concern, but you could create a dialog with a pseudo progress bar to give some feedback to the user.
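The processor itself would be C# code in the uiUpload pipeline; purely to sketch the command-line idea in a language-neutral way, here is the scanning step in Python (clamscan is an arbitrary example scanner, and the path is a placeholder):

```python
# Shell out to a CLI scanner for one file.
# clamscan exit codes: 0 = clean, 1 = infection found, 2 = error.
import subprocess

def is_infected(path: str) -> bool:
    result = subprocess.run(["clamscan", "--no-summary", path])
    if result.returncode == 2:
        raise RuntimeError("scanner error")
    return result.returncode == 1

if is_infected("/tmp/uploaded.tmp"):  # placeholder path
    print("Reject the upload in the pipeline processor")
```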
Take a look at this post by Mike Reynolds. It may help you out:
http://sitecorejunkie.com/2013/11/09/perform-a-virus-scan-on-files-uploaded-into-sitecore/
I am not aware of any published best practices, but if you are able to add a step to the upload process, you might want to take a look at Metascan, which provides API-level integration with multiple antivirus engines. Using this, you could build a workflow for uploaded files that scans them before they hit your Sitecore media library, with rules based on the results from the antivirus engines in your Metascan deployment. There's also a hosted version at metascan-online.com.
Disclaimer: I am an employee of OPSWAT, which produces Metascan, but it appears to be a potential solution to your issue.
In one of our recent projects, we were faced with a requirement to scan incoming files for viruses. The problem in the project was that the files, after being uploaded, were made publicly available on the website.
We solved the problem by integrating https://www.virustotal.com/. It's a free online virus scanner with a public API, and you can send files via SSL.
We implemented the solution by adding newly uploaded files to a Sitecore workflow. The workflow handles the scanning of the files and moves them to its final stage if they aren't infected. If a file is infected, it is deleted.
A scheduler runs every 5 minutes to check for new incoming files in the workflow.
This also means that the files aren't available straight away, as the scheduler has to check them first, but you should be able to implement the functionality directly when the user uploads the file by adding your custom code to the upload pipeline.
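For reference, a minimal sketch of that VirusTotal flow using the public v2 API (the API key and file path are placeholders; the public API is rate-limited, which is one reason a polling scheduler fits well):

```python
# Submit a file to VirusTotal's public v2 API, then poll for the verdict.
# API key and file path are placeholders.
import requests

API_KEY = "YOUR_VIRUSTOTAL_API_KEY"
SCAN_URL = "https://www.virustotal.com/vtapi/v2/file/scan"
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

with open("upload.bin", "rb") as f:
    resp = requests.post(SCAN_URL, data={"apikey": API_KEY}, files={"file": f})
resp.raise_for_status()
resource = resp.json()["resource"]

# Later (e.g. from the 5-minute scheduler) fetch the report.
report = requests.get(
    REPORT_URL, params={"apikey": API_KEY, "resource": resource}
).json()
if report.get("response_code") == 1 and report.get("positives", 0) > 0:
    print("Infected: delete the media item")  # mirrors the workflow's delete step
```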