add ftp support & upload for url/ftp/s3. Reorg tests #1260
Conversation
alexcos20 left a comment
AI automated code review (Gemini 3).
Overall risk: medium
Summary:
This pull request introduces comprehensive support for FTP storage, including both download and upload capabilities. It also extends the existing S3 and URL storage types with upload functionality. Key changes include new type definitions, a dedicated `FTPStorage` class built on `basic-ftp`, and the integration of these features into the core file-information and HTTP route handlers. The existing storage tests have been refactored into dedicated integration test files for each storage type, with new tests covering FTP and the upload functionality. The `formatMetadata` logic in `fileInfoHandler` has been refactored to be more generic, delegating metadata retrieval to the individual storage classes.
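The generic `formatMetadata` shape described above can be pictured roughly as follows. All type names and the stub factory here are simplified, hypothetical stand-ins for the PR's real `Storage.getStorageClass(file, config)` and concrete backend classes, not the actual implementation:

```typescript
// Simplified sketch of a backend-agnostic formatMetadata. Every name below
// is an illustrative stand-in, not the PR's real API.
type StorageObject = { type: 'url' | 'ipfs' | 'arweave' | 's3' | 'ftp'; url?: string }

interface FileMetadata {
  valid: boolean
  contentType: string
  contentLength: string
}

interface StorageLike {
  fetchSpecificFileMetadata(file: StorageObject, forceChecksum: boolean): Promise<FileMetadata>
}

// Stand-in factory: the real code would return UrlStorage, FTPStorage, etc.,
// each implementing the same metadata interface.
function getStorageClass(_file: StorageObject): StorageLike {
  return {
    async fetchSpecificFileMetadata(): Promise<FileMetadata> {
      return { valid: true, contentType: 'application/octet-stream', contentLength: '0' }
    }
  }
}

// No per-backend branching: the storage class owns metadata retrieval.
async function formatMetadata(file: StorageObject): Promise<FileMetadata> {
  const storage = getStorageClass(file)
  return storage.fetchSpecificFileMetadata(file, false)
}
```

The point of the refactor is that adding a sixth backend would only require a new storage class, not another branch in the handler.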
Comments:
• [INFO][security] The S3 `Key` uses `filename` directly, or after appending it to `objectKey`. S3 keys are generally permissive, but it is good practice to sanitize or URL-encode `filename` if it can contain characters that might interfere with key interpretation or client-side display, even though `encodeURIComponent` isn't strictly necessary for the key itself. The `ContentDisposition` header correctly escapes quotes, which is good; this is a minor note for consistency/robustness.
• [STYLE][other] Please remove this `console.log(fileInfo)` statement. It's not part of the test assertion and should not be left in production code.
• [INFO][style] Multiple `license` fields were removed from `node_modules` entries in `package-lock.json`. This is likely a side effect of `npm install` or `npm update` and not directly related to functional changes. It's generally fine, but worth noting to ensure no unintended metadata loss.
• [INFO][security] The `parseFtpUrl` function correctly handles the `ftp:` and `ftps:` protocols. Note that `ftp://user:password@host` sends credentials in cleartext when `secure` is false; this is a characteristic of FTP itself, and the `ftps:` protocol is available for secure transmission. The `unsafeURLs` check adds a further layer of protection, which is good.
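A `parseFtpUrl` along the lines discussed could be sketched with Node's WHATWG `URL` parser. The field names below are assumptions for illustration, not the PR's actual object shape:

```typescript
// Hypothetical sketch of FTP/FTPS URL parsing via the WHATWG URL class;
// FtpConnectionInfo and its fields are illustrative stand-ins.
interface FtpConnectionInfo {
  host: string
  port: number
  user: string
  password: string
  path: string
  secure: boolean // true only for ftps:; plain ftp: sends credentials in cleartext
}

function parseFtpUrl(rawUrl: string): FtpConnectionInfo {
  const url = new URL(rawUrl)
  if (url.protocol !== 'ftp:' && url.protocol !== 'ftps:') {
    throw new Error(`Unsupported protocol: ${url.protocol}`)
  }
  return {
    host: url.hostname,
    // Fall back to the standard FTP control port when none is given.
    port: url.port ? Number(url.port) : 21,
    user: decodeURIComponent(url.username),
    password: decodeURIComponent(url.password),
    path: url.pathname || '/',
    secure: url.protocol === 'ftps:'
  }
}
```

Restricting the accepted protocols up front keeps the `unsafeURLs` check as a second, independent layer rather than the only guard.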
#1263 should be merged here first, before merging this into main
FTP storage, upload support, and unified file metadata
Summary
This PR adds a fifth storage backend (FTP/FTPS), introduces upload support for URL/S3/FTP, centralizes all storage tests as integration tests, and routes all file metadata lookups (including `fileInfo`) through the storage layer for consistency across URL, IPFS, Arweave, S3, and FTP.

Changes
1. New storage backend: FTP
2. Upload support for URL, S3, and FTP
- `Storage` base class (`src/components/storage/Storage.ts`): adds a `hasUpload` flag on the constructor (default `false`) to indicate whether a backend supports uploads.
- `UrlStorage.upload(filename, stream)` (`src/components/storage/UrlStorage.ts`): new upload method.
  - If `url` ends with `/`, the filename is appended to the URL; otherwise the URL is treated as the full target.
  - Sets `Content-Disposition: attachment; filename="..."` and returns status + response headers.
- `S3Storage.upload(filename, stream)` (`src/components/storage/S3Storage.ts`): uses the `@aws-sdk/lib-storage` `Upload` helper for multipart streaming, so large files are uploaded in parts without buffering the entire stream in memory.
  - If `s3Access.objectKey` ends with `/`, the effective key becomes `objectKey + filename`; otherwise `objectKey` is used as-is.
  - Sets `ContentType` and `ContentDisposition` on the uploaded object.
- `FTPStorage.upload(filename, stream)`: if `url` ends with `/`, the filename is appended; otherwise it is used as the full path (with a default `/filename` when the path is empty).

3. Dependencies
- `basic-ftp` (^5.2.0): FTP/FTPS client for `FTPStorage`.
- `@aws-sdk/lib-storage` (^3.1002.0): S3 multipart upload helper for `S3Storage.upload`.

4. FileInfo API and metadata handling
- HTTP route (`src/components/httpRoutes/fileInfo.ts`): accepts `type: 'url' | 'ipfs' | 'arweave' | 's3' | 'ftp'` or `did + serviceId`. For `type: 'ftp'`, it builds a `FtpFileObject` from the `url` and forwards it to the P2P `FILE_INFO` command.
- Core handler (`src/components/core/handler/fileInfoHandler.ts`): `formatMetadata` now takes a `StorageObject` and always calls `Storage.getStorageClass(file, config)`, then `fetchSpecificFileMetadata(file, false)` for all backends (URL, IPFS, Arweave, S3, FTP).
- When `FileInfoCommand` is invoked with `file + type`, it still uses `storage.getFileInfo`, which delegates to the backend-specific `fetchSpecificFileMetadata`.
- With `did + serviceId`, all files in the asset are resolved and their metadata fetched via the storage layer, so S3 and FTP files obtained via DID now have first-class metadata support.

5. Test reorganization and coverage
- `src/test/unit/storage.test.ts`: all storage tests moved to integration.
- New `src/test/integration/storage/` with backend-specific suites: `urlStorage.test.ts`, `s3Storage.test.ts`, `ftpStorage.test.ts`, `ipfsStorage.test.ts`, and `arweaveStorage.test.ts`.

6. Documentation
- `docs/Storage.md`: documents the FTP storage type (`type`, `url`), validation rules, and interaction with `unsafeURLs`, plus the upload path behavior (a trailing `/` appends the filename to the path).
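The trailing-slash convention shared by `UrlStorage.upload`, `S3Storage.upload`, and `FTPStorage.upload` could be captured by a helper along these lines. This is a hypothetical sketch; the PR applies the rule per backend rather than through a shared function:

```typescript
// Hypothetical helper expressing the shared upload-target rule:
// - a base (URL path or S3 objectKey) ending in "/" has the filename appended,
// - an empty base falls back to "/" + filename (the FTP default path case),
// - any other base is used verbatim as the full target.
function resolveUploadTarget(base: string, filename: string): string {
  if (base === '') return '/' + filename
  if (base.endsWith('/')) return base + filename
  return base
}
```

Treating a non-slash-terminated base as the full target lets callers pin an exact destination name independent of the uploaded file's original filename.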