Wipe an imported API in Postman

I am importing a complex JSON schema with several APIs declared. Postman has been an essential tool to test these APIs as I develop them, but I am iterating a lot and it is becoming very cumbersome to manually delete every API, which requires typing in each API's name.
Short of a new feature (say "reimport JSON schema folder"), is there a way for me to delete a file to reset my Postman back to its original installed state? I'm on macOS. I have tried deleting a few folders I thought would do the trick, but they keep getting recreated with my former APIs intact.

Related

Superset examples don't seem to disable

I am running Superset in Docker. At first, the example datasets, charts and so on were loading. After some time I decided to disable the examples.
I changed the configuration to SUPERSET_LOAD_EXAMPLES=no in the .env file. I also tried deleting this key from .env. However, the examples don't seem to disappear. How can they be deleted completely?
If you've run Superset once with SUPERSET_LOAD_EXAMPLES=yes, that will populate the example data and charts/dashboards into your metadata database alongside any actual charts and data. My understanding from searching the codebase is that there's no way to undo this.
If you really want those examples gone, you can start fresh from a new metadata db, but that would mean discarding any content you've created. Or you can try deleting the examples manually, either in the UI or in the database backend.
If you can't start fresh, my personal advice is to just ignore them. Eventually they'll get pushed to the bottom by your new content, and it's nice to have them around in case you need a reproducible example for a bug report. I tried deleting some of them from the database backend, and all I did was corrupt them, so now I get 500 errors when I try to load the examples.

Is there a programmatic way to export Postman Collections?

I have an ever-growing collection of Postman tests for my API that I regularly export and check in to source control, so I can run them as part of CI via Newman.
I'd like to automate the process of exporting the collection when I've added some new tests - perhaps even pull it regularly from Postman's servers and check the updated version into git.
Is there an API I can use to do this?
I would happily settle for a script I could run to export my collections and environments to named JSON files.
Such a feature should be available in Postman Pro when you use the Cloud instance feature (I haven't used it yet, but I'll probably do so for continuous integration). I'm also interested, and here's what I've found:
FYI, you don't even need to export the collection. You can use Newman to talk to the Postman cloud instance and call collections directly from there. Therefore, when a collection is updated in Postman, Newman will automatically run the updated collection on its next run.
You can also add in the environment URLs to have Newman automatically swap environments (we use this to run a healthcheck collection across all our environments [Dev, Test, Stage & Prod]).
You should check out this feature; Postman licences are not particularly expensive, and it can be worth it.
Hope this helps,
Alexandre
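To make that concrete, here is a minimal sketch of how a CI job could run a cloud-hosted collection with Newman. Assumptions: the newman CLI is installed (e.g. via npm), the collection/environment UIDs are placeholders you'd replace with your own, and the API key comes from a hypothetical POSTMAN_API_KEY environment variable; it's wrapped in a small Python script, but you could just as easily call newman straight from your CI shell.

```python
import os
import subprocess

# Hypothetical placeholders: your collection/environment UIDs and an API key
# generated in your Postman account settings.
api_key = os.environ["POSTMAN_API_KEY"]
collection_url = f"https://api.getpostman.com/collections/<collection-uid>?apikey={api_key}"
environment_url = f"https://api.getpostman.com/environments/<environment-uid>?apikey={api_key}"

# Newman fetches the latest saved version of the collection on every run,
# so there is no manual export step.
subprocess.run(
    ["newman", "run", collection_url, "--environment", environment_url],
    check=True,
)
```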
You have probably solved your problem by now, but for anyone else coming across this, Postman has an API that lets you access information about your collections (search for "Postman API" to find the documentation). You could call /collections to get a list of all your collections, then query for the one(s) you want by their id.
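If you do want the files checked in to git, a small script along these lines can do the export. This is only a sketch against the public Postman API mentioned above: the X-Api-Key header and the /collections and /environments endpoints are the documented ones, while the POSTMAN_API_KEY environment variable and the output directory name are placeholders of my own.

```python
import json
import os

import requests

BASE_URL = "https://api.getpostman.com"
OUT_DIR = "postman-export"                              # hypothetical output directory
HEADERS = {"X-Api-Key": os.environ["POSTMAN_API_KEY"]}  # your Postman API key

os.makedirs(OUT_DIR, exist_ok=True)

def export(kind, wrapper_key, suffix):
    """List every item of `kind` (collections/environments) and save each one to a named JSON file."""
    listing = requests.get(f"{BASE_URL}/{kind}", headers=HEADERS).json()[kind]
    for summary in listing:
        detail = requests.get(f"{BASE_URL}/{kind}/{summary['uid']}", headers=HEADERS).json()
        path = os.path.join(OUT_DIR, f"{summary['name']}.{suffix}.json")
        with open(path, "w") as fh:
            json.dump(detail[wrapper_key], fh, indent=2)

export("collections", "collection", "postman_collection")
export("environments", "environment", "postman_environment")
```

Run something like this from a scheduled job or CI step and commit the output directory afterwards.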

Foswiki: Uploading and downloading topics without FTP

I have a Foswiki wiki on a server. Is it possible to script the following without FTP access (for various reasons I can't use it):
Download a topic's wikitext, modify it locally, then upload it again (overwriting the topic)
Upload wikitext to a new topic
I've been doing these tasks manually, but I'd like to automate them. I've looked into the Foswiki API and a few plugins, but nothing seems capable of doing this.
Is there a way? (any programming language)
If you have web access, you could drive the bin/view and bin/save scripts remotely from a script.
Take a look at our BuildContrib upload target for an example. It gets a strikeone key and downloads the original topic to recover any form data. It then uploads the topic text, creating a new version. It's written in Perl and uses LWP.
https://github.com/foswiki/distro/blob/master/BuildContrib/lib/Foswiki/Contrib/BuildContrib/Targets/upload.pm
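The same idea in a rough, non-Perl sketch. Assumptions: the wiki URL, web/topic names and credentials are placeholders, the site accepts simple HTTP authentication, and strikeone validation is not enforced; if it is, you would need to fetch a validation key first, as the BuildContrib target above does.

```python
import requests

# Hypothetical placeholders: wiki base URL, web/topic name, and credentials.
BASE = "https://wiki.example.com/bin"
WEB, TOPIC = "Sandbox", "MyTopic"

session = requests.Session()
session.auth = ("WikiUser", "password")  # assumes HTTP auth; TemplateLogin sites need a login POST instead

# Download the topic's raw wikitext via the view script.
wikitext = session.get(f"{BASE}/view/{WEB}/{TOPIC}", params={"raw": "text"}).text

# ... modify the wikitext locally ...
wikitext += "\n\n-- edited by script --\n"

# Upload it again through the save script, creating a new revision.
# Note: sites with strikeone validation also require a validation_key field,
# obtained from the edit form, which this sketch omits.
resp = session.post(f"{BASE}/save/{WEB}/{TOPIC}", data={"text": wikitext})
resp.raise_for_status()
```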
The following isn't(!) the right solution (surely a nicer Foswiki-native approach exists), but if you know Perl, you can do anything with the following:
Install Firefox
Install the MozRepl add-on into it
Install the WWW::Mechanize::Firefox Perl module
Now you can script anything you can do directly from the browser, e.g. logging into Foswiki, clicking buttons, saving topics, and so on. The drawback: it isn't an easy way, and you need to know many details.
I use this technique myself for testing.

About Sitecore Backup

I am trying to back up a whole Sitecore website.
I know that the package designer can do part of the job, but not entirely.
Having a backup is always good in case the site breaks accidentally.
Is there a way or a tool to back up the whole Sitecore website?
I am new to Sitecore, so any advice is welcome.
Thank you!
We've got a SQL job running to back up the databases nightly.
Apart from that, when I deploy code and it's a small change, I usually only back up the parts I'm going to replace. If it's a big code deploy I just back up the whole website (code-wise anyway) before deploying the code package.
We also run scheduled backups of the code (although I don't know the intervals), and of course we've got source control if everything else fails.
If you've got an automated deployment tool you could also automate the above, of course.
Before a major deploy of content or code, I typically back up the master database and zip everything in the website directory minus the App_Data and temp directories. That way, if the deploy goes wrong, I can restore the code and database fairly quickly and be back to the previous state.
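For the file half of that, a short script can build the zip while skipping those directories. Just a sketch: the site root and backup paths below are placeholders for your own install, and the master database backup itself would still be a normal SQL backup as described in the other answers.

```python
import os
import zipfile

# Hypothetical paths: point these at your own website root and backup target.
SITE_ROOT = r"C:\inetpub\wwwroot\MySite\Website"
BACKUP_ZIP = r"C:\Backups\MySite-pre-deploy.zip"
EXCLUDE = {"App_Data", "temp"}  # top-level folders to skip, per the advice above

with zipfile.ZipFile(BACKUP_ZIP, "w", zipfile.ZIP_DEFLATED) as zf:
    for root, dirs, files in os.walk(SITE_ROOT):
        # Prune the excluded folders at the top level so os.walk never descends into them.
        if os.path.normpath(root) == os.path.normpath(SITE_ROOT):
            dirs[:] = [d for d in dirs if d not in EXCLUDE]
        for name in files:
            full_path = os.path.join(root, name)
            zf.write(full_path, os.path.relpath(full_path, SITE_ROOT))
```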
I don't know of a tool that can do this for you, but there are a few ways you can handle it fairly easily:
1) You can create a database backup of the master database, but this only contains content, not files such as media stored on disk or your complete built solution. It is always a good idea to schedule your database backup every night and keep the backups for at least a week or more.
2) When you use the package designer, you can create dynamic packages that can contain all your content, media files and solution files on disk. This is an easy way to deploy the site onto a new Sitecore installation all at once, but it requires a manual backup every time.
3) Another option is to serialize your entire content tree to XML on disk from the Developer tab. Once serialized, you can revert the items back into the content tree.
I'd suggest thinking of this in two parts. The first part is backing up the application, which is as simple as making sure your application is in some SCM system.
For that you can use Team Development for Sitecore. One of its features allows you to connect a Visual Studio project to your Sitecore instance.
You can select Sitecore items that you want to be stored in your solution and it will serialize them and place them into your solution.
You can then check them into your SCM system and sleep easier.
The thing to note is deciding which items to place in source control. Generally you can think of Sitecore items as developer-owned and Content Editor-owned. The items you will place in your solution are the developer-owned ones: templates, sublayouts, layouts, and content items that you need for the site to function are good examples.
This way, if something goes bad, a base restoration is quick and easy.
The second part is the backup of the content in Sitecore that has been added since your deployment. For that, as Trayek said above, use a SQL job to do the backups at whatever interval you are comfortable with.
If you're bored, I have a post on using TDS (Team Development for Sitecore) you can check out: Working with Sitecore, Part Nine: TDS
Expanding a bit more on what Trayek said, my suggestion would be to set up Continuous Integration (CI) with automated deploys using TeamCity.
A good answer is also given here on Stack Overflow.
Basically, in your case TeamCity would automatically:
1. Take a backup of the current website (i.e. the code) and deploy the new code on top of it.
2. Scripts can also be written to take a differential backup of the SQL databases, if need be.
Hope this helps.
Take a look at the Sitecore Instance Manager module. It works really well for packaging an entire Sitecore instance.

How to monitor an FTP upload directory in ColdFusion without using event gateways?

Having spent a couple of hours coding an event gateway solution, I discovered that they are not supported by CF Standard Edition. Buggerit! So back to the drawing board.
I can see how I can check the folder's dateLastModified attribute using cfdirectory, and so I can run a scheduled task to see when a new file has been uploaded, but what's the best way of storing/comparing the file list so as to get a list of just the ones added since the last check?
General hints/links appreciated
Assuming that, for whatever reason, you can't use a gateway, the simplest solution that springs to mind is to move files you've processed to a separate directory. Then, your scheduled task can just deal with files in the FTP directory itself.
"they are not supported by CF standard edition"
Are you still using CF7? They have been supported by CF Standard Edition since CF8.
As Henry pointed out, you can use an Event Gateway.
If you decide not to use that approach, I'd suggest a ColdFusion scheduled task. The most foolproof algorithm for that task is storing the results of the last <cfdirectory/> call, either in a persistent scope (application or server) or by writing it out to a database or file (e.g. WDDX). The reason to hold on to all this information, rather than just a timestamp, is to handle situations where newly added or changed files do not take on the correct timestamp for whatever reason (a skewed system clock comes to mind).
If you use a database to capture the data, you could use a MINUS/EXCEPT query in Oracle or SQL Server, respectively, to determine what's new. Otherwise you'll need to perform some nested looping in ColdFusion over the old and new queries to generate the list of new files.
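The comparison itself is just a set difference between the previous listing and the current one. Here is that logic sketched in Python purely for illustration (the directory and state-file paths are made up); in ColdFusion you would build the same structures from the <cfdirectory/> query and keep the previous one in a persistent scope or a WDDX file, as described above.

```python
import json
import os

# Hypothetical paths: the FTP upload directory and a state file holding the last listing.
UPLOAD_DIR = "/data/ftp/incoming"
STATE_FILE = "/data/ftp/last_listing.json"

# Build the current listing: file name -> size and modified time,
# mirroring the columns a <cfdirectory> query would give you.
current = {}
for name in os.listdir(UPLOAD_DIR):
    path = os.path.join(UPLOAD_DIR, name)
    if os.path.isfile(path):
        stat = os.stat(path)
        current[name] = {"size": stat.st_size, "mtime": stat.st_mtime}

# Load the listing saved by the previous run (empty on the first run).
try:
    with open(STATE_FILE) as fh:
        previous = json.load(fh)
except FileNotFoundError:
    previous = {}

# Files present now but not last time are the newly uploaded ones,
# the same result the MINUS/EXCEPT query would give you in SQL.
new_files = sorted(set(current) - set(previous))
print("New since last check:", new_files)

# Persist the current listing for the next scheduled run.
with open(STATE_FILE, "w") as fh:
    json.dump(current, fh)
```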