Increase Attach Document MB Size vtiger 6.4

I want to attach documents that will be larger than 10 to 15 MB. Please suggest how to increase the file upload size limit (in MB) across the entire system. I am using vtiger 6.4. Thanks in advance.

Please set the below value in the config.inc.php file:
$upload_maxsize = 15728640;
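The value is simply the target size expressed in bytes (15 MB here); a quick sanity check of the arithmetic:

```python
# Upload limits like vtiger's $upload_maxsize are expressed in bytes.
def mb_to_bytes(mb: int) -> int:
    return mb * 1024 * 1024

print(mb_to_bytes(15))  # 15728640, the value set in config.inc.php above
```

Note that PHP's own limits can still cap the effective upload size, so `upload_max_filesize` and `post_max_size` in php.ini may also need to be raised to at least the same value.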

Related

Issue with reading millions of files from cloud storage using dataflow in Google cloud

Scenario: I am trying to read files and send their data to Pub/Sub.
Millions of files are stored in a Cloud Storage (GCP) folder.
I created a Dataflow pipeline using the "Text Files on Cloud Storage to Pub/Sub" template, which publishes to a Pub/Sub topic.
However, the template was not able to read millions of files and failed with the following error:
java.lang.IllegalArgumentException: Total size of the BoundedSource objects generated by split() operation is larger than the allowable limit. When splitting gs://filelocation/data/*.json into bundles of 28401539859 bytes it generated 2397802 BoundedSource objects with total serialized size of 199603686 bytes which is larger than the limit 20971520.
System configuration:
Apache beam: 2.38 Java SDK
Machine: High performance n1-highmem-16
Any idea on how to solve this issue? Thanks in advance
According to this document (1) you can work around this by modifying your custom BoundedSource subclass so that the generated BoundedSource objects become smaller than the 20 MB limit.
(1) https://cloud.google.com/dataflow/docs/guides/common-errors#boundedsource-objects-splitintobundles
You can also use TextIO.readAll() to avoid these limitations.
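To see why the split fails at this scale, the figures in the error message can be checked with plain arithmetic (no Beam dependency, numbers copied from the error above):

```python
# Figures taken from the IllegalArgumentException message.
num_sources = 2_397_802          # BoundedSource objects generated by split()
total_serialized = 199_603_686   # total serialized size, bytes
limit = 20_971_520               # the 20 MB hard limit, bytes

avg_per_source = total_serialized / num_sources   # roughly 83 bytes each
max_sources_under_limit = limit // int(avg_per_source)

print(f"avg ~{avg_per_source:.0f} B/source; the limit allows "
      f"~{max_sources_under_limit:,} sources at that rate")
```

So any approach that keeps the number of generated sources in the low hundreds of thousands, or that reads the file names as pipeline data (as TextIO.readAll() does) instead of splitting them at graph-construction time, stays under the limit.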

Django: limit image upload size

I am attempting to limit the size of an image that can be uploaded. In the docs I found DATA_UPLOAD_MAX_MEMORY_SIZE and set it to 3 MB (3145728 bytes) in my settings.py file, but I am still able to upload files larger than 3 MB. I also tried FILE_UPLOAD_MAX_MEMORY_SIZE and the same thing occurred. The only way I can get it to trigger is by setting it to a very low value such as 1 or 2. Any ideas on what I'm doing wrong?
From the docs for DATA_UPLOAD_MAX_MEMORY_SIZE, it does not include uploaded files:
The check is done when accessing request.body or request.POST and is calculated against the total request size excluding any file upload data.
FILE_UPLOAD_MAX_MEMORY_SIZE defines when an uploaded file is saved to the filesystem instead of staying in memory; it does not impose any limit on how large the uploaded file can be.
Your best bet is to configure your web server to limit the upload size (client_max_body_size if you are using nginx, for example).
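If you also want the limit enforced at the application level, so the user gets a form error rather than a web-server 413, a common pattern is a small file-size validator attached to the field. A dependency-free sketch of the check; in a real project this would raise django.core.exceptions.ValidationError from a FileField/ImageField validator, and the names here are illustrative:

```python
MAX_UPLOAD_BYTES = 3 * 1024 * 1024  # 3 MB

def validate_upload_size(uploaded_file):
    # Django's UploadedFile exposes .size regardless of whether the file
    # was kept in memory or spooled to disk by the upload handlers.
    if uploaded_file.size > MAX_UPLOAD_BYTES:
        raise ValueError(
            f"File is {uploaded_file.size} bytes; limit is {MAX_UPLOAD_BYTES}."
        )
    return uploaded_file
```

Pairing this with the web-server limit covers both cases: nginx rejects grossly oversized requests cheaply, and the validator gives a friendly error for anything that slips under that ceiling.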

What is the Maximum file size for using multipart upload in s3?

Does anyone know the maximum file size for a multipart upload in S3? When I tried to upload a 10 GB file, it got stuck, and there is no error message in the log.
Thanks in advance.
The maximum size of an object you can store in an S3 bucket is 5 TB, so the maximum file size for a multipart upload is also 5 TB. From the documentation:
Using the multipart upload API, you can upload large objects, up to 5 TB.
The multipart upload API is designed to improve the upload experience for larger objects. You can upload objects in parts. These object parts can be uploaded independently, in any order, and in parallel. You can use a multipart upload for objects from 5 MB to 5 TB in size.
Official documentation: http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html
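A related constraint worth knowing (per the S3 service limits, as I understand them): a multipart upload allows at most 10,000 parts, with each part between 5 MiB and 5 GiB (the last part may be smaller). That means the part size has to scale with the object; a quick sketch of the minimum viable part size:

```python
import math

MAX_PARTS = 10_000              # maximum parts per multipart upload
MIN_PART = 5 * 1024 ** 2        # 5 MiB minimum part size
MAX_OBJECT = 5 * 1024 ** 4      # 5 TiB object size ceiling

def min_part_size(object_bytes: int) -> int:
    """Smallest allowed part size that fits the object in <= 10,000 parts."""
    return max(MIN_PART, math.ceil(object_bytes / MAX_PARTS))

print(min_part_size(10 * 1024 ** 3))   # a 10 GiB file: 5 MiB parts suffice
print(min_part_size(MAX_OBJECT))       # a 5 TiB file needs ~525 MiB parts
```

For the 10 GB upload in the question, the default 5 MiB parts are well within limits, so a stall is more likely a client timeout or network issue than a size limit.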

What is the large a.cfm file located in the CFIDE images folder?

What is this a.cfm file inside the CFIDE images folder C:\inetpub\wwwroot\CFIDE\images? We are running ColdFusion 9. This file is using a large amount of storage space, growing about 1+ MB per minute (1-2 GB per day), and I have to delete the file to free up the storage. But the a.cfm file is automatically recreated after some time.
Does anybody know what this a.cfm file is and its purpose?
How can we stop the a.cfm file from taking up so much storage space?
Thanks for your comments. I found what was causing it and fixed it. There was a DSN for a SQL database with the 'Log activity' option enabled, and the database log was being saved to C:\inetpub\wwwroot\CFIDE\images\a.cfm. I disabled 'Log activity' and the issue was resolved!

How to check Allowed Attachment File Size in Amazon SES?

I am using Amazon SES for sending mail in a custom PHP project, and I am facing a couple of issues.
1) Amazon SES only lets me send small PDF files. Where can I change the file size limit? I am unable to find it.
2) Amazon SES only allows PDF files to be sent. Whenever I try to send any other file type, it says "illegal file name". Please tell me how to fix this.
Thanks in advance. Any help would be highly appreciated.
The AWS SES message size limit is 10 MB. It will allow PDFs and many other file types, but there are restrictions.
You can read more here: http://aws.amazon.com/ses/faqs/#49
If you need to send a restricted file type, you can rename the file before it goes out; the recipient then has to know enough to rename it back when it arrives (which is a pain), so I use a backup SMTP server in those cases.
While the default is 10 MB, since 2021 it has been possible to request that Amazon increase your maximum message size to up to 40 MB, as per https://aws.amazon.com/ses/faqs/#49.
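One detail that catches people out: the limit applies to the entire MIME message after encoding, and base64 inflates attachments by roughly a third. A rough pre-flight check before calling SES can save a failed send; this sketch assumes the default 10 MB limit and uses an illustrative 4 KB allowance for headers and body text:

```python
import base64

SES_LIMIT = 10 * 1024 * 1024  # default SES message size limit, in bytes

def encoded_attachment_size(raw_bytes: bytes) -> int:
    # base64 expands n bytes to ceil(n / 3) * 4, about 33% larger.
    return len(base64.b64encode(raw_bytes))

def fits_in_message(attachment: bytes, headers_and_body: int = 4096) -> bool:
    return encoded_attachment_size(attachment) + headers_and_body <= SES_LIMIT

payload = b"x" * (8 * 1024 * 1024)   # an 8 MB file...
print(fits_in_message(payload))      # ...encodes to ~10.7 MB: False
```

So with the default limit, the largest practical raw attachment is only around 7 MB, which is likely why only small PDFs were going through.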