Heroku Django Amazon S3: video recording and playing - django

Hi everybody!
I have a Django app running on a Heroku server, and I have attached Amazon S3 storage to it.
I want to allow users to record webcam videos, upload them (ultimately to S3), and then let other users play them.
What is the easiest way to do that?
I have already spent more than 20 hours researching this topic, but I still have no idea.
People usually use streaming servers like RED5 + Flash players + something + something... But that seems to be very complicated and, as I understand it, not appropriate for Heroku...
I would appreciate any help!

The solution that most people go with is to use a Flash applet to record into memory and then bulk-upload the result at the end; Nimbb seems to be the most popular choice.
Alternatively, YouTube has a pretty neat API for doing the recording and uploading to their servers entirely from within your site.
The YouTube Upload Widget lets your website's visitors perform both webcam and file uploads to YouTube. The support for webcam uploads sets the upload widget apart from the other uploading options that the YouTube API supports. The widget uses HTML5's postMessage support to send messages back to your website regarding videos uploaded via the widget.
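If you do want the recordings to end up in your own S3 bucket rather than on YouTube, the upload step can also be done with a presigned POST generated by your Django app, so the browser sends the file straight to S3 and Heroku never has to proxy it. A rough sketch using boto3 (the bucket name, key scheme, and size limit below are placeholders, and this is not tied to any particular recorder):

    # views.py -- hypothetical endpoint that hands the browser a presigned POST
    # so a recorded video can be uploaded directly to S3 (assumes boto3 and
    # AWS credentials are already configured for the app).
    import uuid

    import boto3
    from django.http import JsonResponse


    def presign_video_upload(request):
        s3 = boto3.client("s3")
        key = "webcam-videos/{}.webm".format(uuid.uuid4())  # placeholder key scheme
        post = s3.generate_presigned_post(
            Bucket="my-video-bucket",  # placeholder bucket
            Key=key,
            Conditions=[["content-length-range", 0, 500 * 1024 * 1024]],
            ExpiresIn=3600,
        )
        # The client POSTs the recorded blob to post["url"] with post["fields"]
        # as form data; playback can then use normal S3 (or CloudFront) URLs.
        return JsonResponse(post)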

Related

What is the best way to handle uploaded images in Django REST Framework on a production server, and how do I do it?

I'm relatively new to DRF, and I'm facing a problem where I have to set up many (thousands of) profile images. I need to do it "properly", so I guess uploading raw photos to the server is not the best way to do it. What service/approach should I use?

Updating an S3 website with zero downtime

What would be the best approach to update a static website hosted in an S3 bucket such that there is no downtime? The updates will be done by a company's marketing teams, who have zero knowledge of CLI commands or how to move around in the console. Are there ways to achieve this without having to learn to move around in the console?
Edit
The website is a collection of static HTML pages and will be updated using an HTML editor. Once it is edited, the marketing team will upload each individual updated file to the S3 bucket. There are no more than 10 such files, including HTML and images. The site is currently hosted on a shared server, and we now want to move it to an S3 bucket capable of hosting simple web pages. The preference is not to provision console access for these users, as they are comfortable only with using a WYSIWYG HTML editor and uploading via an FTP client. The editors don't know HTML and the site doesn't use JavaScript. I am thinking of writing a batch script to manage the uploads, to keep all the CLI complexity away so they only work on the HTML in the editor. I am looking for the simplest approach to achieve this.
You can set up a CI/CD pipeline as mentioned here:
Update static website in an automated way
The above pipeline generally has a code commit as the trigger. It depends on what your marketing teams are doing with the content and how they are updating it. If they are updating content that is hosted on AWS, you can change the trigger to S3 updates. The solution depends on the individual use case, which may require some development on your side to make things simpler for your marketing teams.
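For instance, that bit of development could be as small as a one-file upload script the marketing team runs after editing; a minimal sketch using boto3 (the bucket name and local folder are assumptions):

    # upload_site.py -- hypothetical helper that pushes the edited HTML pages and
    # images to the S3 bucket, so the editors never touch the console or the CLI.
    import mimetypes
    import os

    import boto3

    BUCKET = "my-static-site-bucket"  # placeholder bucket name
    LOCAL_DIR = "site"                # placeholder folder holding the edited files

    s3 = boto3.client("s3")

    for name in os.listdir(LOCAL_DIR):
        path = os.path.join(LOCAL_DIR, name)
        if not os.path.isfile(path):
            continue
        content_type = mimetypes.guess_type(name)[0] or "application/octet-stream"
        # Each upload overwrites the existing object in place, so visitors keep
        # receiving the old copy until the new one has fully replaced it.
        s3.upload_file(path, BUCKET, name, ExtraArgs={"ContentType": content_type})
        print("uploaded", name)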
I'm unsure what you are asking here, because to "update" a static website, surely you must have some technical knowledge of the very basics of web development.
It's important here to define what exactly you mean by update, because again, updating a website and updating a bucket are two completely different things.
Also, S3 has eventual consistency for overwrite PUTs (updates), so there may be a short window during which the old content is still served.
The easiest way to update an S3 bucket is via the console, and not the CLI. The console is pretty user friendly, and shouldn't take long to get used to.

Stuck in transition between BlueHost and AWS

I can't seem to figure out how to query database information from MySQL, through a PHP file uploaded to a server, back to a Xamarin application after something like an HTTP POST call.
I am relatively new to database development; I used Bluehost for PHP storage and querying of my app's information for about a year, yet I can't seem to do the same with AWS. I have a relatively decent grasp of what S3, EC2, and RDS do for you, yet none of them seem to do what I want. In essence, I want someone to be able to click a button in my Xamarin app that creates an event for a club. After they click said button, it will make an HTTP POST request to an AWS-hosted site that contains a PHP file. That PHP file will change or grab the contents of a MySQL database. The only place that seems to let me upload files is S3, but it doesn't appear possible to directly run PHP files there to return the JSON format that I want. Am I completely off about why you would use AWS at all, am I close but not using the right tools, or is it something else? Please give a fairly extensive description of everything I would need to do, as I have surprisingly been struggling to find anything at all on this topic.
So after some digging, this video here walks through it a bit, using PhpStorm as your way to connect to the database and add web pages or API PHP files. It's pretty long but very helpful.

Allow large file upload from browser while navigating to another page

I'm building a website with Django 1.11 and a fairly simple JavaScript/HTML/CSS front end (no framework like Vue.js). There is a page reload on each navigation, which is fine for my use case.
For convenience, I serve my website from App Engine Standard and it's going well so far. Now I need my users to be able to upload files (up to 300 MB in size). Due to App Engine's limit on request size (32 MB), I'm using signed URLs so I can send these files directly from my client-side JavaScript to Cloud Storage.
Due to the size of the files, the upload may take some time, but I can't navigate to another page since that may cancel the upload. I understand that for a case like this a client app such as a single-page app in Vue.js would be appropriate, but is there a way to achieve this with my current setup without rewriting my whole website (possibly with Vue.js and a Django REST API)?
Any suggestions would be much appreciated.
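For context, generating such a signed URL on the Django side might look roughly like the following sketch (assuming the google-cloud-storage client library; the bucket name, object path, and expiry are placeholders, and this only illustrates the setup described above rather than solving the navigation problem):

    # views.py -- hypothetical view returning a V4 signed URL so the browser can
    # PUT a large file straight to Cloud Storage, bypassing the 32 MB request limit.
    from datetime import timedelta

    from django.http import JsonResponse
    from google.cloud import storage


    def signed_upload_url(request):
        client = storage.Client()
        bucket = client.bucket("my-upload-bucket")            # placeholder bucket
        blob = bucket.blob("uploads/" + request.GET["name"])  # placeholder object path
        url = blob.generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=30),
            method="PUT",
            content_type=request.GET.get("type", "application/octet-stream"),
        )
        # The client-side JavaScript then issues a PUT to this URL with the
        # matching Content-Type header.
        return JsonResponse({"url": url})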

Setting up a CDN with Wagtail CMS

I am looking into possibly setting up a CDN to use with my Wagtail sites. I am thinking that this will be a more efficient way to manage media uploads during stage/production pushes, since right now the media folder has to be manually copied from server to server on deploy. If all of the images were being accessed from a CDN then this wouldn't be an issue.
This would be my first time using a CDN so I'm looking for advice. There is lots of info on using a CDN with WordPress, but not a lot of documentation on setting one up with Wagtail/Django. I have the following questions about it:
Does anyone have any suggestions on the best way to implement the CDN with Wagtail?
How does it handle the uploads that the user submits through the CMS? Most of the images will be uploaded as part of the static files, but how does it work when the user uploads a photo as part of a post?
Which CDN companies have you had the best/worst experiences with? The sites I am planning to use this for are professional/business, but not e-commerce.
Also, if there is a more efficient way to handle the transfer of media uploads from one environment to another than using a CDN, I'd love to hear your suggestions for that too. As of right now I've had to copy the media folder over after doing the deploy, and I will have to do this every time I make a change to the site.
Thanks in advance for your assistance.
The following resources can be helpful for your required setup in Wagtail (later on today I can provide you some more details):
Frontend cache invalidator for pages (so not only for static and media files); see the configuration sketch after this list
Link: http://docs.wagtail.io/en/latest/reference/contrib/frontendcache.html#frontendcache-aws-cloudfront
Storing media files in Amazon Web Services S3 buckets
This would be a better solution than copying media files from server to server. In that case Amazon Web Services CloudFront (CDN) would be a perfect choice.
Link: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#amazon-s3
More info on CloudFront: https://aws.amazon.com/cloudfront/
Static file cache invalidation with Django Whitenoise
This can be relevant for clearing the cache after a new deployment (the static files get unique filenames, so the CDN will fetch fresh copies from its origin after the deployment).
Link: http://whitenoise.evans.io/en/stable/django.html
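As a rough illustration of the first resource above, the configuration boils down to something like this (the distribution ID is a placeholder, and the backend's module path differs between Wagtail versions, so check the linked docs):

    # settings.py -- sketch of frontend cache invalidation against CloudFront;
    # the wagtail.contrib.frontend_cache app also needs to be in INSTALLED_APPS.
    WAGTAILFRONTENDCACHE = {
        "cloudfront": {
            "BACKEND": "wagtail.contrib.frontend_cache.backends.CloudfrontBackend",
            "DISTRIBUTION_ID": "your-distribution-id",  # placeholder
        },
    }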
CloudFront from AWS would be my personal choice for a CDN. Besides the awesome resources/services AWS has to offer, CloudFront is simple to set up and one of the best CDNs out there.
Finally, a CDN for serving static and media files has nothing to do with Wagtail specifically. There are some nice apps available for Django itself (see the list above), but you are free to choose another CDN solution (like Cloudflare).
So setting up an AWS S3 bucket for each environment (tst/acc/stg/prd), using it for your uploaded media files (so the files aren't on the server anymore), and setting up a CloudFront distribution for those buckets would be a proper solution for your problem.
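For example, the media-storage part of that setup might look roughly like this with django-storages (a sketch only; the bucket name and CloudFront domain are placeholders, with one pair per environment):

    # settings.py -- sketch of serving Wagtail media uploads from S3 behind CloudFront.
    DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"

    AWS_STORAGE_BUCKET_NAME = "my-wagtail-media-prd"       # placeholder bucket (one per environment)
    AWS_S3_CUSTOM_DOMAIN = "d1234abcd5678.cloudfront.net"  # placeholder CloudFront domain
    AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}

    MEDIA_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN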
Best regards,
Rob Moorman