Is it possible to copy a file from SharePoint to S3? Preferably coding it from the AWS side.
I've searched but I'm not seeing much out there. There's a question with a similar title, but it doesn't answer this one:
upload files from sharepoint online to aws s3 bucket
It is possible for sure. SharePoint Online has a REST API. I use a Python package called office365, which implements a SharePoint client that handles most of the daily operations you will need.
The repo is: https://github.com/O365/python-o365
Some tips about things I struggled with the first time:
The ClientContext object requires the base site URL of the SharePoint site that contains the library you want to authenticate against. For example, if the document library is:
https://mysharepoint.mydomain.com/sites/mysite/shareddocuments/
then the URL you must pass to the ClientContext will be:
https://mysharepoint.mydomain.com/sites/mysite
The UserCredential method requires your user in the following format: user#mydomain
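As a rough sketch of the whole round trip (this assumes the ClientContext/UserCredential style of client mentioned above, i.e. the Office365-REST-Python-Client flavor, plus boto3 on the AWS side; the site URL, credentials, file path and bucket name are all placeholders):

import io
import boto3  # AWS SDK for Python
from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext

# Placeholder values -- replace with your own site, credentials and bucket.
site_url = "https://mysharepoint.mydomain.com/sites/mysite"  # base site URL, not the library URL
ctx = ClientContext(site_url).with_credentials(UserCredential("user#mydomain", "password"))

# Server-relative URL of the file inside the document library (placeholder).
file_url = "/sites/mysite/Shared Documents/report.xlsx"

# Download the file from SharePoint into memory.
buf = io.BytesIO()
ctx.web.get_file_by_server_relative_url(file_url).download(buf).execute_query()
buf.seek(0)

# Upload the bytes to S3.
s3 = boto3.client("s3")
s3.upload_fileobj(buf, "my-bucket", "sharepoint/report.xlsx")

If you run this from the AWS side (e.g. in a Lambda function), boto3 picks up the S3 credentials from the execution role, so only the SharePoint credentials need to be supplied.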
I need your help regarding serving my web app:
I have a Node.js app running in Express on Beanstalk and it is working fine.
However, I am thinking of using S3 instead. My website makes all sorts of AJAX calls, and it also has a couple of modules in a Node.js folder generated via npm install.
Here is the AWS documentation link I used:
http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html
But it is not very clear to me.
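From what I understand so far, that page is about putting a CORS configuration on the bucket so that browser AJAX calls from another origin are allowed. A minimal sketch of that with boto3 (the bucket name and origin below are just placeholders) would be something like:

import boto3

s3 = boto3.client("s3")
s3.put_bucket_cors(
    Bucket="my-website-bucket",  # placeholder bucket name
    CORSConfiguration={
        "CORSRules": [
            {
                "AllowedOrigins": ["https://www.example.com"],  # placeholder origin
                "AllowedMethods": ["GET", "POST"],
                "AllowedHeaders": ["*"],
                "MaxAgeSeconds": 3000,
            }
        ]
    },
)

But I am not sure that covers my case.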
Is it doable to use S3 for this website?
Also, can you please provide some pros and cons? I would appreciate it, because this will help me a lot in deciding which one to choose.
I'm wondering how you can sync your Postman config with a git repository.
I know you can export and import from Postman to a folder - which is OK - but I wondered if there was something more effortless.
I'm not exactly sure how you're trying to use this, but a few options would be:
First Option
To use their add-on CLI called Newman. You can run collections from a URL or a local file with Newman using
newman run http://some.url.here
Then, if you make the remote URL part of a git repository, the collection you run will update/change with each commit/pull.
Second Option
Try this with extreme caution and only if you feel comfortable with the process. It may also not be compliant with their terms of use, so I don't suggest you try it without doing some research first.
If you can find the directory in which the Postman collections are held, you could create a hard link from a git repository on your machine to the directory or specific file you need to link. Whenever you change the source file, the one in the Postman config will change as well.
How you accomplish this depends on your operating system and version of Postman; a sketch follows below.
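As a rough illustration in Python (the paths below are purely hypothetical - the real Postman data directory depends on your OS and Postman version, and both paths must live on the same filesystem):

import os

# Hypothetical paths -- adjust both to your machine.
repo_file = "/home/me/git/postman-collections/my-collection.json"
postman_file = "/home/me/.config/Postman/collections/my-collection.json"

# Create a hard link so both paths refer to the same file on disk;
# editing one is editing the other. os.link raises FileExistsError if
# the destination already exists, so move the original out of the way first.
os.link(repo_file, postman_file)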
In addition to exporting and cloud syncing as mentioned in the other answers, there are some other options too.
Postman added Git sync in the Postman app v9, so you can manage version control with forking, merging, and pull requests.
There are also built-in integrations to sync your Postman collections with GitHub, GitLab, and other version-control services. These integrations are for users on the paid plans.
Postman also has an API, so you can GET the latest version of your collection or environment and run it with Newman or your continuous-integration tools, or use it to build your own integration.
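For example, a small script along these lines (the API key and collection UID are placeholders, and Newman has to be installed separately) could pull the latest version of a collection and run it:

import json
import subprocess
import requests

# Placeholder values -- use your own Postman API key and collection UID.
API_KEY = "PMAK-your-key-here"
COLLECTION_UID = "1234567-89abcdef-0123-4567-89ab-cdef01234567"

# GET the latest version of the collection from the Postman API.
resp = requests.get(
    "https://api.getpostman.com/collections/" + COLLECTION_UID,
    headers={"X-Api-Key": API_KEY},
)
resp.raise_for_status()

# The API wraps the collection in a "collection" key; save the inner object
# so Newman can run it as a plain Collection v2 file.
with open("collection.json", "w") as f:
    json.dump(resp.json()["collection"], f)

subprocess.run(["newman", "run", "collection.json"], check=True)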
Postman is not designed for that case. They offer a cloud service which keeps you and your collaborators in sync. You can try their cloud plan for 30 days for free. Check here: https://www.getpostman.com/cloud_trial_faq
You can use Postman integrations (Home > Integrations) to link Postman to your remote git repository.
The following article explains how to integrate your GitLab repo with Postman:
https://learning.postman.com/docs/integrations/available-integrations/gitlab/
You can also use Postman API versioning to do something similar:
https://learning.postman.com/docs/designing-and-developing-your-api/versioning-an-api/
For non-free plans, Postman now (version 9 and up) supports automatic sync of collections with a git repository on several popular git services.
(Again, it's currently only available for paid plans)
See the documentation for how to integrate Postman with GitHub, GitLab and Bitbucket.
The process is roughly:
create a dedicated repo on your git provider (e.g. my-postman-collections-repo)
create a personal access token for the provider (e.g. GitHub) with the expected scope (e.g. repo and user)
define an integration (using postman UI) for each collection you want to be kept in sync
I'm working with the GitHub integration and it works great.
How do I download the artifacts from a Jenkins build to my local machine using Python?
I will assume that you want to download via http.
You might want to use GNU wget, for example, but if you really want to use Python, check out How do I download a file over HTTP using Python?. urllib2 (urllib.request in Python 3) provides an easy way to handle HTTP requests.
This is provided that you will not need to perform any additional actions to get to the file (authentication, etc.).
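A minimal sketch with the requests library (you could do the same with urllib2/urllib.request); the server, job name and artifact path are placeholders following the usual /job/<name>/lastSuccessfulBuild/artifact/<path> layout:

import requests

# Placeholder Jenkins URL and artifact path -- adjust for your server and job.
url = ("http://jenkins.example.com/job/my-job/"
       "lastSuccessfulBuild/artifact/dist/app.zip")

# If your Jenkins requires it, authenticate with a username and API token.
resp = requests.get(url, auth=("user", "api-token"))
resp.raise_for_status()

with open("app.zip", "wb") as f:
    f.write(resp.content)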
I am building a simple app which has to connect to an external REST API, get data in JSON and print it for the user. My questions are:
Where should I put/create a module which will connect to the external REST API? I mean, I could just write some code in views.py which connects to the REST API and then passes the results to a template, but I want to separate it into some autonomous module which I could use in views.py like myapimodule.get_devices(), which would, for example, connect and get data from example.com/api/device/get. I tried Python and Django for the first time today, so I just want to know where you put (and how) such modules in a Django app.
How can I connect to the REST API with Django? I have, for example, a username and password for HTTP auth and an address like example.com/api/device/get - what parts of the Django library will allow me to use the REST API and retrieve data from it (in JSON format)?
There's really no "right" answer. It just depends on what's best for your needs.
To connect to an external REST API, take a look at the excellent Requests library.
The Requests library is worth learning. It will save you a lot of grief.
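As a minimal sketch of both of your questions (the module name, URL and credentials are placeholders, not anything Django prescribes), you could keep the API client in its own module inside the app and import it from views.py:

# myapimodule.py -- a standalone helper module inside your Django app
import requests

API_BASE = "https://example.com/api"

def get_devices(username, password):
    # Fetch the device list from the external API using HTTP basic auth.
    resp = requests.get(API_BASE + "/device/get", auth=(username, password))
    resp.raise_for_status()
    return resp.json()

# views.py
from django.http import JsonResponse
from .myapimodule import get_devices

def devices(request):
    data = get_devices("myuser", "mypassword")  # placeholder credentials
    return JsonResponse(data, safe=False)

Django itself doesn't care where the module lives; any importable module inside the app (or a separate package) works, which is why the placement is largely a matter of taste.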
Did you try:
pip install chardet
before
pip install requests
?