I'm working on an unusual project where I need to upload huge files (500 MB+) and store them in a PostgreSQL table as blobs.
Before you tell me that storing files in the database is bad and that I should use the filesystem instead, hear (read) my reasons:
- Access to the files must be authorized through the app, where per-file permissions are applied.
- I will need to keep a history (versioning) of each file, giving access to previous revisions.
- I can't afford the risk of losing consistency between the relations and the files/versions over the years.
Now, on to the questions:
- Do I risk getting timeouts during upload/download, or does an in-progress transfer count as a 'keep-alive'?
- Are the files loaded completely into memory for each user, or are they buffered (streamed) to the user? I need to allow several thousand users to download big files at the same time.
- Are there any packages that can help me with this?
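To illustrate the buffering question: whether memory blows up depends on how the blob is read, not on PostgreSQL itself. Below is a minimal sketch of chunked streaming, assuming PostgreSQL large objects accessed via psycopg2, whose `connection.lobject()` returns a file-like handle that the same generator would work with. The connection/response names in the comments are hypothetical.

```python
import io

CHUNK_SIZE = 64 * 1024  # 64 KiB per read keeps memory flat per concurrent download


def stream_in_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield successive chunks so the whole blob is never held in memory at once."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk


# With a real connection this would look roughly like (names are assumptions):
#   lob = conn.lobject(oid, mode="rb")    # psycopg2 large-object handle, file-like
#   for chunk in stream_in_chunks(lob):
#       response.write(chunk)             # hypothetical HTTP response object

# Demo with an in-memory stand-in for a large object:
buf = io.BytesIO(b"example-bytes" * 10000)
total = sum(len(c) for c in stream_in_chunks(buf))
```

The point is that each simultaneous download only ever holds one chunk in memory, so thousands of concurrent users cost roughly `users x CHUNK_SIZE`, not `users x file size`.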