Conversation
…ods to return input directly
… of LocalFileUploaderService
…file deletion logic in BookService
Updated the `<Project Sdk="Microsoft.NET.Sdk.Web">` line to include a leading space for consistency. No functional changes were made to the project structure.
- Added `IFileUploaderService` dependency to `BookController`.
- Updated the `DownloadFile` method to use the file name in the route.
- Implemented error handling for file download failures.
- Added the `[Authorize]` attribute to the `DownloadFile` method.
- Removed unused local file path handling code.
Updated the `BookService` class to remove related `BookAuthors` when a `Book` is deleted. This change ensures referential integrity by preventing orphaned `BookAuthors` in the database.
This commit introduces a new asynchronous method `DownloadFileAsync` in the `S3FileUploaderService` class. The method allows for downloading files from an S3 bucket using a specified `fileName`. It handles the creation of a `GetObjectRequest`, retrieves the file, and returns it as a `MemoryStream`. Additionally, it includes error handling for potential `AmazonS3Exception` cases, providing a clear error message if the file is not found or the key is incorrect.
Introduced a new method `DownloadFileAsync` in the `IFileUploaderService` interface for asynchronous file download functionality, accepting a `string fileName` parameter and returning a `Task<Stream>`.
- Added `<UserSecretsId>` to `Refhub.csproj` for user secrets management.
- Updated `appsettings.Development.json` to remove the hardcoded AWS `AccessKey` and `SecretKey`, replacing them with empty strings for improved security.
Caution: Review failed. The pull request is closed.

Walkthrough

This update migrates file management from local storage to S3-compatible cloud storage (ArvanCloud), introduces new configuration and error handling for file operations, and revises the BookController and related services accordingly. It also enforces URL validation for book images, adds new enums and static helpers for bucket/directory naming, and updates database schema migrations and resource files.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant BookController
    participant S3FileUploaderService
    participant S3 (ArvanCloud)
    participant Logger
    participant MessageService
    User->>BookController: DownloadFile(fileUrl)
    BookController->>BookController: Validate fileUrl
    alt Invalid fileUrl
        BookController->>User: BadRequest (InvalidFileName)
    else Valid fileUrl
        BookController->>S3FileUploaderService: DownloadFileAsync(fileUrl, ...)
        alt S3 returns file stream
            BookController->>User: File (with content type)
        else FileDownloadException
            BookController->>Logger: Log error
            BookController->>User: NotFound (FileNotFound)
        else Other Exception
            BookController->>Logger: Log error
            BookController->>User: InternalServerError (DownloadError)
        end
    end
```
Estimated code review effort: 3 (~45 minutes)

Possibly related PRs
Summary of Changes
Hello @MehdiKh-hub, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
This pull request implements S3 cloud storage integration, specifically targeting ArvanCloud, to handle file uploads, downloads, and deletions for books and associated files. It replaces the existing local file storage logic by introducing a new S3 service implementation and updating the relevant application code and interfaces. Additionally, an unrelated database migration is included to optimize the Books table slug column.
Highlights
- S3 Integration: Introduced integration with S3 (specifically configured for ArvanCloud) for file storage, replacing the previous local file storage mechanism.
- File Uploader Service Update: The `IFileUploaderService` interface and its implementations (`LocalFileUploaderService` and the new `S3FileUploaderService`) were updated to support S3 operations, including a new `DownloadFileAsync` method and a simplified `DeleteFile` signature.
- Dependency Injection Switch: The application's dependency injection configuration was updated to use the new `S3FileUploaderService` as the concrete implementation for `IFileUploaderService`.
- Book File Handling Logic: The `BookService` and `BookController` were updated to utilize the new S3-compatible file uploader service methods and handle file paths/URLs appropriate for S3.
- Database Migration: A new database migration was added to change the `Slug` column in the `Books` table to `nvarchar(450)` and add a unique index on it.
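Read together, the highlights above imply a service contract along these lines. This is a sketch only: `DownloadFileAsync` and the simplified `DeleteFile(string)` signature are stated in this PR, while the `UploadFile` parameter list is an assumption.

```csharp
// Hypothetical shape of the revised interface; the UploadFile parameters
// are assumptions, not taken from the diff.
public interface IFileUploaderService
{
    Task<string> UploadFile(IFormFile file, DirectoryTypes folder, string type, string name);
    Task DeleteFile(string realUrl);                 // simplified: one URL argument
    Task<Stream> DownloadFileAsync(string fileName); // new in this PR
}
```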
Code Review
This pull request introduces S3 integration for file storage. Key changes include a new S3FileUploaderService, updates to controllers and services to use S3, and related configuration and model adjustments. Several critical and high-priority issues need addressing, including the S3 URL generation logic, error handling in the BookController, ensuring data integrity during book deletion, and completing the implementation of LocalFileUploaderService.
```diff
-await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.FilePath);
+await uploaderService.DeleteFile(book.FilePath);
 }
 context.BookAuthors.RemoveRange(book.BookAuthors);
```
The line `context.BookAuthors.RemoveRange(book.BookAuthors);` is likely redundant. Also, the `BookRelation` entity is configured with `OnDelete(DeleteBehavior.Restrict)`, which is not handled in `DeleteBookAsync` and will lead to runtime errors. Either prevent deletion of books with active relations, or explicitly remove/update these relations before deleting the book.
```csharp
// Consider removing this line if cascade delete for BookAuthors is correctly configured and working.
// context.BookAuthors.RemoveRange(book.BookAuthors);

// CRITICAL: Handle BookRelations before deleting the book due to DeleteBehavior.Restrict
// Example: Load and remove related BookRelations
// var relations = await context.BookRelations
//     .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
//     .ToListAsync(ct);
// if (relations.Any()) context.BookRelations.RemoveRange(relations);

// Ensure BookKeywords are also handled if not covered by cascade delete (they appear to be).
```

```csharp
private string GenerateS3Url(string key)
{
    // For ArvanCloud:
    return $"https://{_bucketName}.{_region}/{key}";
}
```
The S3 URL generated here (`https://{_bucketName}.{_region}/{key}`) follows a virtual-hosted style format, but the S3 client is configured with `ForcePathStyle = true`. Verify ArvanCloud's S3 API documentation regarding `ForcePathStyle` and the correct format for public object URLs to ensure consistency and correctness.
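To make the mismatch concrete: a path-style URL (which matches `ForcePathStyle = true`) puts the bucket in the path rather than the host. A minimal sketch, assuming a `_serviceUrl` field holding the configured `ServiceURL` (the helper name is hypothetical, not part of this PR):

```csharp
// Virtual-hosted style (what GenerateS3Url builds):  https://{bucket}.{host}/{key}
// Path-style (consistent with ForcePathStyle=true):  https://{host}/{bucket}/{key}
private string GeneratePathStyleS3Url(string key)
{
    var host = _serviceUrl.TrimEnd('/'); // e.g. "https://s3.ir-thr-at1.arvanstorage.ir"
    return $"{host}/{_bucketName}/{key}";
}
```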
```csharp
public Task DeleteFile(string realUrl)
{
    throw new NotImplementedException();
}
```
```csharp
    return NotFound();
}
var stream = await _s3FileUploaderService.DownloadFileAsync(fileName);
var contentType = "application/octet-stream"; // or the actual content type based on the file extension
```
The `contentType` is hardcoded. Consider determining the content type dynamically based on the file extension for a better user experience.
```csharp
// Example: inject FileExtensionContentTypeProvider or create an instance
// private readonly FileExtensionContentTypeProvider _contentTypeProvider = new FileExtensionContentTypeProvider();
if (!new Microsoft.AspNetCore.StaticFiles.FileExtensionContentTypeProvider().TryGetContentType(fileName, out var contentType))
{
    contentType = "application/octet-stream"; // Fallback
}
```

```csharp
DirectoryTypes.Books => nameof(DirectoryTypes.Books),
DirectoryTypes.Files => nameof(DirectoryTypes.Files),

_ => "Order"
```
The default case `_ => "Order"` in the switch expression for `GetDirectoryName` could lead to unexpected behavior. Consider throwing an `ArgumentOutOfRangeException` for unhandled enum values to make such issues explicit.
```csharp
_ => throw new System.ArgumentOutOfRangeException(nameof(folder), $"Unsupported directory type: {folder}")
```
Actionable comments posted: 8
🔭 Outside diff range comments (4)
Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1)
5-15: 🛠️ Refactor suggestion: Identity methods & naming inconsistencies undermine the utility of this helper.

- Both `ConvertForBookPathImage` and `ConvertForBookPathFile` are now pure identity functions; they add no value and invite confusion.
- The class/extension suffix is misspelled (`ExtionMethod` → `ExtensionMethod`).

Consider deleting the extensions entirely and inlining the call-sites, or restore the previous path-building logic behind the new `DirectoryTypes` abstraction.

```diff
-public static string ConvertForBookPathImage(this string imageName)
-{
-    return imageName;
-}
-public static string ConvertForBookPathFile(this string fileName)
-{
-    return fileName;
-}
+// Remove these two methods and update call-sites accordingly,
+// or re-implement them to return $"{DirectoryTypes.BookImages}/{imageName}" etc.
```

Refhub/Service/Implement/BookService.cs (2)
208-216: ⚠️ Potential issue: Delete happens before successful re-upload, risking permanent data loss.

In the update path you

```csharp
await uploaderService.DeleteFile(existingBook.FilePath);
// …
existingBook.FilePath = await uploaderService.UploadFile(...);
```

If the new upload fails, the original file is already gone. Swap the order (upload → verify → delete old) or perform a copy-on-write approach.

```diff
-await uploaderService.DeleteFile(existingBook.FilePath);
-existingBook.FilePath = await uploaderService.UploadFile(...);
+var newPath = await uploaderService.UploadFile(...);
+if (!string.IsNullOrWhiteSpace(newPath))
+{
+    await uploaderService.DeleteFile(existingBook.FilePath);
+    existingBook.FilePath = newPath;
+}
```
256-272: ⚠️ Potential issue: `BookAuthors` not eagerly loaded, so a `NullReferenceException` or missing cascade delete is possible.

`book` is fetched without `.Include(b => b.BookAuthors)`, yet `RemoveRange(book.BookAuthors)` is invoked. When lazy-loading is disabled (the default in EF Core) this will throw; when enabled it will incur N+1 queries. Either:

```csharp
var book = await context.Books
    .Include(b => b.BookAuthors)
    .FirstOrDefaultAsync(b => b.Id == bookId, ct);
```

or rely on cascade delete and drop the explicit `RemoveRange`.

Refhub/Service/Implement/LocalFileUploaderService.cs (1)
7-28: 🛠️ Refactor suggestion: Method marked `async` but performs only sync IO.

`UploadFile`:

- Uses synchronous `FileStream` & `CopyTo`, blocking the thread.
- Never `await`s, so the `async` modifier only adds overhead.

Either switch to truly async APIs or drop `async`/`await`:

```diff
 public async Task<string> UploadFile(...)
 {
-    ...
-    using (Stream stream = new FileStream(path, FileMode.Create))
-    {
-        file.CopyTo(stream);
-    }
-    return name;
+    await using var stream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None, 4096, useAsync: true);
+    await file.CopyToAsync(stream);
+    return name;
 }
```

Also, `new Random()` per call is unsafe in parallel code; use `Guid.NewGuid()` or `RandomNumberGenerator`.
🧹 Nitpick comments (13)
Refhub/Models/Enums/DirectoryTypes.cs (1)
3-9: Enum naming & documentation could be tightened.

- Consider adding XML-doc comments for the enum itself and each value; consumers (especially in Swagger) will then see meaningful descriptions.
- "Files" is very generic; if it is acting as a catch-all, document that intent explicitly.
- If the numeric value 0 matters only for `None`, explicitly set the rest as well for readability:

```diff
 None = 0,
-Images,
-Books,
-Files,
+Images = 1,
+Books = 2,
+Files = 3,
```

Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (2)
1-4: Unused using directives.

`Refhub.Service.Implement` and `Refhub.Service.Interface` are not referenced in this file; remove them to keep compile-time warnings quiet.

9-14: Leverage the Options pattern with validation.

The current binding registers a plain `IOptions<S3Configuration>` without validation. Using `AddOptions` enables automatic checks:

```diff
-builder.Services.Configure<S3Configuration>(
-    builder.Configuration.GetSection("AWS:S3"));
+builder.Services.AddOptions<S3Configuration>()
+    .Bind(builder.Configuration.GetSection("AWS:S3"))
+    .ValidateDataAnnotations()
+    .Validate(config => !string.IsNullOrWhiteSpace(config.ServiceURL),
+        "ServiceURL must not be empty in non-AWS environments.")
+    .ValidateOnStart();
```

This will fail fast at startup for broken configs instead of at first use.
Refhub/Refhub.csproj (1)
16-16: Isolate the AWS SDK to where it is needed.

If S3 usage is confined to one project, mark the package reference with `<PrivateAssets>all</PrivateAssets>` to avoid leaking it to consuming projects in the solution:

```diff
-<PackageReference Include="AWSSDK.S3" Version="4.0.1.4" />
+<PackageReference Include="AWSSDK.S3" Version="4.0.1.4">
+  <PrivateAssets>all</PrivateAssets>
+</PackageReference>
```

Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1)
1-1: Remove the now-unused `Refhub.Tools.Static` import.

The `FolderNameStatic` helper is no longer referenced after the refactor; keep `using` directives minimal to avoid unnecessary dependencies.

Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)
2-3: Namespace and folder structure look off.
`Refhub.Service.Implement.S3_Sample.Service` is overly nested and redundant (`Service` appears twice, plus the "S3_Sample" folder name leaks an implementation detail). Prefer something like `Refhub.Service.Implement.S3`, or move the class beside the other implementers.

Refhub/Migrations/20250611081248_fix-mig.cs (1)
8-9: Rename migration class to PascalCase to silence CS8981.
`fixmig` violates C# naming conventions and produces compiler warnings. Rename it to `FixMig` (and update the file name accordingly).

🧰 Tools
🪛 GitHub Actions: 🚀 Build Pipeline
[warning] 8-8: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.
Refhub/Tools/Static/FolderNameStatic.cs (1)
7-17: Default case returns the magic string `"Order"`.

`DirectoryTypes.None` (or any future enum value) silently maps to `"Order"`, which is misleading and brittle. Consider either throwing `ArgumentOutOfRangeException` or adding an explicit `Order` member to the enum for clarity.

Refhub/Service/Interface/IFileUploaderService.cs (1)
5-9: Parameter casing & naming consistency
`string Type` violates .NET naming conventions (`type`, `category`, etc.) while the other parameters are camel-cased. Minor, but keeping the API surface consistent aids discoverability.

Refhub/Controllers/BookController.cs (1)
3-4: Avoid referencing concrete S3 implementation in controller
`using Refhub.Service.Implement.S3_Sample.Service;` is unnecessary; the controller depends only on the interface. Keeping controllers free of implementation namespaces prevents accidental tight coupling.

Refhub/Service/Implement/S3FileUploaderService.cs (2)
51-56: `GetKey` relies on a brittle string replace.

Direct string replacement will fail for mixed-case URLs, query strings, or when the service URL pattern changes. Consider parsing via `Uri`:

```diff
-private string GetKey(string realUrl)
-{
-    return realUrl.Replace($"https://{_bucketName}.{_region}/", "");
-}
+private static string GetKey(string realUrl)
+{
+    var uri = new Uri(realUrl);
+    return uri.AbsolutePath.TrimStart('/');
+}
```
57-75: Sanitise `name` before building the S3 key.

Only spaces are replaced; special characters (`/`, `\`, `?`, …) will break S3 keys or allow directory traversal inside the bucket. Minimum hardening:

```diff
-var key = $"{directoryName}/{type}/{name.Replace(" ","-")}{Path.GetExtension(file.FileName)}";
+var safeName = Path.GetFileNameWithoutExtension(name)
+    .Replace(" ", "-")
+    .Replace("/", "-")
+    .Replace("\\", "-");
+var key = $"{directoryName}/{type}/{safeName}{Path.GetExtension(file.FileName)}";
```
15-17: Migration class name triggers compiler warning
`partial class fixmig` violates the C# naming rules (CS8981). Rename the migration (e.g., `FixMig`) and regenerate to avoid future keyword conflicts and improve readability.

🧰 Tools
🪛 GitHub Actions: 🚀 Build Pipeline
[warning] 16-16: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (17)
- Refhub/Controllers/BookController.cs (3 hunks)
- Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1 hunks)
- Refhub/Migrations/20250611081248_fix-mig.cs (1 hunks)
- Refhub/Migrations/AppDbContextModelSnapshot.cs (3 hunks)
- Refhub/Models/AppSetting/S3Configuration.cs (1 hunks)
- Refhub/Models/Enums/DirectoryTypes.cs (1 hunks)
- Refhub/Program.cs (1 hunks)
- Refhub/Refhub.csproj (2 hunks)
- Refhub/Service/Implement/BookService.cs (4 hunks)
- Refhub/Service/Implement/LocalFileUploaderService.cs (2 hunks)
- Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)
- Refhub/Service/Interface/IFileUploaderService.cs (1 hunks)
- Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (2 hunks)
- Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1 hunks)
- Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1 hunks)
- Refhub/Tools/Static/FolderNameStatic.cs (1 hunks)
- Refhub/appsettings.Development.json (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
`S3Configuration` (3-10)
Refhub/Service/Implement/BookService.cs (1)
Refhub/Tools/Static/FolderNameStatic.cs (2)
`FolderNameStatic` (5-18), `GetDirectoryName` (7-17)
Refhub/Service/Implement/S3FileUploaderService.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
`S3Configuration` (3-10)
🪛 GitHub Actions: 🚀 Build Pipeline
Refhub/Migrations/20250611081248_fix-mig.cs
[warning] 8-8: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.
Refhub/Service/Implement/LocalFileUploaderService.cs
[error] 5-5: CS0535: 'LocalFileUploaderService' does not implement interface member 'IFileUploaderService.DownloadFileAsync(string)'.
Refhub/Migrations/20250611081248_fix-mig.Designer.cs
[warning] 16-16: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.
🔇 Additional comments (6)
Refhub/Refhub.csproj (1)
6-8: Confirm the .NET 9 target & AWS SDK version compatibility.

Targeting `net9.0` (currently preview) with AWSSDK.S3 4.0.1.4 is bold; verify runtime compatibility before production deployment. Consider pinning to LTS (`net8.0`) unless a preview is intentional.

Refhub/Program.cs (1)
24-28: Bind after configuration sources are added
`builder.BindS3Model()` runs before the extra `AddJsonFile(...)` calls. While the default files are already loaded, any values coming from the additional environment-specific file may be missed. Safer order:

```diff
-#region CustomExtentionMethod
-builder.BindS3Model();
-builder.Services.AddCustomService();
+// Additional config sources
+builder.Configuration
+    .SetBasePath(Directory.GetCurrentDirectory())
+    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
+    .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional: true)
+    .AddEnvironmentVariables();
+
+// Bind after all sources are in place
+builder.BindS3Model();
+
+builder.Services.AddCustomService();
```

Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)
18-19: Verify the lifetime & configuration of `S3FileUploaderService`.

`AddScoped` is fine if the service is stateless. If it wraps an `IAmazonS3` client, that client should be a singleton to avoid socket exhaustion; ensure the service internally resolves a singleton client from DI instead of constructing one per request. Also double-check that `S3Configuration` is bound before this registration (startup order matters).

Refhub/Migrations/AppDbContextModelSnapshot.cs (1)
20-21: Auto-generated file; no manual changes required. The snapshot reflects the new unique index and updated column type. Looks correct.
Refhub/Service/Implement/BookService.cs (1)
157-159: Confirm uploader returns a globally unique key, not just a local filename
`UploadFile` is invoked with enum-based folder names, which is 👍.
However, both current S3 and local implementations appear to return only the generated file name (no prefix / folder).
If you later compose a presigned URL or need to address the object in S3, a mere filename will collide across folders/buckets.
Please verify the contract; it should probably return the full object key (or an absolute URL) instead.

Refhub/Service/Implement/S3FileUploaderService.cs (1)
31-42: Region handling is inconsistent.

You hard-code `RegionEndpoint.USEast1` while separately storing `_region` from configuration and later building URLs from it. If the bucket actually lives outside `us-east-1`, requests succeed only because a custom `ServiceURL` is supplied; signing/URL generation may still break. Either:

- Pick the correct enum from `RegionEndpoint.GetBySystemName(_s3Options.Value.Region)` (preferred), or
- Omit `RegionEndpoint` when `ServiceURL` is set.
```json
"AWS": {
  "S3": {
    "Region": "s3.ir-thr-at1.arvanstorage.ir",
    "AccessKey": "",
    "SecretKey": "",
    "BucketName": "test12333555",
    "ServiceURL": "https://s3.ir-thr-at1.arvanstorage.ir/"
  }
},
```
🛠️ Refactor suggestion
Avoid committing secrets & clarify S3 settings.
• AccessKey / SecretKey must never be committed with real values; rely on environment variables or dotnet user-secrets in local dev.
• Region currently contains a full endpoint. Conventionally this should be the region code (ir-thr-at1) while the endpoint/ServiceURL carries the scheme & host. Mislabeling may break SDK region resolution.
```json
"AWS": {
  "S3": {
    "Region": "ir-thr-at1",
    "ServiceURL": "https://s3.ir-thr-at1.arvanstorage.ir",
    "AccessKey": "${AWS_ACCESS_KEY_ID}",
    "SecretKey": "${AWS_SECRET_ACCESS_KEY}"
  }
}
```

🤖 Prompt for AI Agents
In Refhub/appsettings.Development.json lines 2 to 10, do not commit real AWS
AccessKey and SecretKey values; instead, replace them with placeholders or
environment variable references to avoid exposing secrets. Also, update the
Region field to only contain the region code (e.g., "ir-thr-at1") rather than
the full endpoint URL, and ensure the ServiceURL field contains the full scheme
and host URL. This aligns with AWS SDK conventions and prevents region
resolution issues.
```csharp
migrationBuilder.AlterColumn<string>(
    name: "Slug",
    table: "Books",
    type: "nvarchar(450)",
    nullable: false,
    oldClrType: typeof(string),
    oldType: "nvarchar(max)");

migrationBuilder.CreateIndex(
    name: "IX_Books_Slug",
    table: "Books",
    column: "Slug",
    unique: true);
}
```
💡 Verification agent
❓ Verification inconclusive
Unique index may fail on existing duplicate slugs.
The migration assumes all current `Books.Slug` values are unique and ≤ 450 characters. If either condition is violated, the deploy will fail.
Recommend adding a data-cleanup step (or a pre-deployment check) before applying the migration:
```csharp
// Example check
if (migrationBuilder.Sql("SELECT Slug, COUNT(*) FROM Books GROUP BY Slug HAVING COUNT(*) > 1").Any())
    throw new InvalidOperationException("Duplicate slugs detected; manual cleanup required.");
```

Add pre-migration checks for duplicate or over-length slugs
Creating a unique index on `Books.Slug` and shrinking it to `nvarchar(450)` will fail if there are existing duplicate values or slugs longer than 450 characters. Please add validation steps in the `Up()` method before altering the column:
• File: Refhub/Migrations/20250611081248_fix-mig.cs
Location: inside `protected override void Up(MigrationBuilder migrationBuilder)`, before the `AlterColumn` call.
Suggested patch:
```diff
 public partial class fix_mig : Migration
 {
     protected override void Up(MigrationBuilder migrationBuilder)
     {
+        // Prevent migration if there are duplicate slugs
+        migrationBuilder.Sql(@"
+            IF EXISTS (
+                SELECT Slug
+                FROM Books
+                GROUP BY Slug
+                HAVING COUNT(*) > 1
+            )
+            THROW 51000, 'Duplicate slugs detected; manual cleanup required.', 1;
+        ");
+
+        // Prevent migration if any slug exceeds 450 characters
+        migrationBuilder.Sql(@"
+            IF EXISTS (
+                SELECT 1
+                FROM Books
+                WHERE LEN(Slug) > 450
+            )
+            THROW 51000, 'Slugs longer than 450 characters detected; manual cleanup required.', 1;
+        ");
         migrationBuilder.AlterColumn<string>(
             name: "Slug",
             table: "Books",
             type: "nvarchar(450)",
             nullable: false,
             oldClrType: typeof(string),
             oldType: "nvarchar(max)");
         migrationBuilder.CreateIndex(
             name: "IX_Books_Slug",
             table: "Books",
             column: "Slug",
             unique: true);
     }
 }
```

Alternatively, run a one-off cleanup script or pre-deployment job to remove duplicates and truncate or reject over-long slugs.
🤖 Prompt for AI Agents
In Refhub/Migrations/20250611081248_fix-mig.cs around lines 13 to 26, before
altering the Slug column and creating the unique index, add a pre-migration
validation step in the Up() method to check for duplicate Slug values and any
Slugs longer than 450 characters. Use migrationBuilder.Sql to query for
duplicates and length violations, and if any are found, throw an
InvalidOperationException to prevent the migration from proceeding until manual
cleanup is done. This ensures the unique index creation and column alteration
will not fail due to existing data issues.
```csharp
using Microsoft.Extensions.Options;
using Refhub.Models.AppSetting;
using Refhub.Service.Interface;
```
Missing core using directives: the code will not compile.

`IFormFile`, `Path`, and `MemoryStream` are referenced, but `Microsoft.AspNetCore.Http` and `System.IO` are not imported. Compilation will fail with CS0246 / CS0103.
```diff
-using Microsoft.Extensions.Options;
+using System.IO;
+using Microsoft.AspNetCore.Http;
+using Microsoft.Extensions.Options;
```
🤖 Prompt for AI Agents
In Refhub/Service/Implement/S3FileUploaderService.cs at the top of the file
(lines 1 to 4), add the missing using directives for Microsoft.AspNetCore.Http
to resolve IFormFile, and System.IO to resolve Path and MemoryStream. This will
fix the compilation errors CS0246 and CS0103 caused by undefined types.
Implement cascade delete behavior for BookId and AuthorId in BookAuthorConfiguration to ensure related entries are automatically removed when a Book or Author is deleted.
Updated the `BookService` class to support file uploads for both book files and images using `IFileUploaderService`. Added validation for upload paths. Improved the book deletion method to load and remove related `BookAuthors` and `BookRelations`, ensuring all associated entities are handled correctly during deletion.
Actionable comments posted: 2
🔭 Outside diff range comments (1)
Refhub/appsettings.Development.json (1)
1-10: Add missing AWS:S3 configuration.

The code registers `S3Configuration` from the `AWS:S3` section, but no such section is defined in any JSON/YAML settings. You must either add it to your environment-specific config file(s) or supply the values via environment variables.

- Location requiring attention: `Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs` (calls `builder.Configuration.GetSection("AWS:S3")`)
- Recommended JSON snippet (e.g. in `appsettings.Development.json`):

```json
{
  "Logging": { … },
  "AWS": {
    "S3": {
      "Region": "us-east-1",
      "AccessKey": "<YourAccessKey>",
      "SecretKey": "<YourSecretKey>",
      "BucketName": "<YourBucketName>"
    }
  }
}
```

Alternatively, set these via environment variables using the double-underscore convention:
- AWS__S3__Region
- AWS__S3__AccessKey
- AWS__S3__SecretKey
- AWS__S3__BucketName
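For reference, the code that consumes this section (per `BindAppSettingToModelExtentionMethod.cs`, which calls `GetSection("AWS:S3")`) would bind it roughly as follows. This is a sketch only — the `S3Configuration` property names and the `SomeS3Consumer` class are assumptions that mirror the JSON keys above, not the repository's actual code:

```csharp
// Sketch (assumed names): bind the AWS:S3 section to a settings class
// and expose it to services via the options pattern.
builder.Services.Configure<S3Configuration>(
    builder.Configuration.GetSection("AWS:S3"));

// A consumer then receives the bound values through IOptions<T>:
public class SomeS3Consumer
{
    private readonly S3Configuration _config;

    public SomeS3Consumer(IOptions<S3Configuration> options)
        => _config = options.Value;
}
```

Whichever source supplies the values (JSON file or environment variables), the binding path is the same, which is why the section name must match exactly.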
♻️ Duplicate comments (15)
Refhub/Tools/Static/BucketNameStatic.cs (2)
7-17: LGTM! Clean enum-to-string mapping implementation.

The static class provides a clean, type-safe way to convert `BucketNames` enum values to their string representations. The switch expression with `nameof()` is appropriate and the exception handling is correct. The past review comment about the exception message has been addressed - the current code correctly references "bucket name" rather than "directory type".

15-15: Fix the exception message to reference bucket names instead of directory type.

The exception message incorrectly refers to "directory type" when this method handles bucket names.
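The mapping pattern under review is, in outline, something like the following — a sketch reconstructed from the comments above (enum members and method name assumed from later comments in this review), not the repository's actual file:

```csharp
using System;

public enum BucketNames { BookPdf, BookImages }

public static class BucketNameStatic
{
    // Type-safe enum-to-string mapping via a switch expression and nameof().
    public static string GetName(BucketNames bucket) => bucket switch
    {
        BucketNames.BookPdf => nameof(BucketNames.BookPdf),
        BucketNames.BookImages => nameof(BucketNames.BookImages),
        // Per the comment above, the message should say "bucket name",
        // not "directory type".
        _ => throw new ArgumentOutOfRangeException(nameof(bucket),
                 $"Unsupported bucket name: {bucket}")
    };
}
```

The `nameof()` calls keep the string values in lockstep with the enum member names, so a rename refactoring cannot silently desynchronize them.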
Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (1)
50-53: LGTM! Proper conditional rendering implemented.

The conditional check `@if (!string.IsNullOrEmpty(Model.ImagePath))` correctly prevents broken image links. This properly addresses the removal of the `ConvertForBookPathImage()` extension method. The past review comment about URL validation at the model level and database migration verification remains valid but is outside the scope of this view file.
Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)
13-22: LGTM! Cascade delete behavior properly implemented.

The addition of `OnDelete(DeleteBehavior.Cascade)` to both foreign key relationships ensures proper cleanup of `BookAuthor` entries when either a `Book` or `Author` is deleted. This maintains referential integrity and reduces the need for manual cleanup in service layers. This addresses the previous review suggestions about implementing cascade delete behavior for consistency across the application.
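For context, the EF Core configuration pattern described here would look roughly like this — a sketch assuming conventional `BookAuthor` navigation property names, not the repository's exact code:

```csharp
public void Configure(EntityTypeBuilder<BookAuthor> builder)
{
    // Deleting a Book automatically removes its BookAuthor join rows.
    builder.HasOne(ba => ba.Book)
        .WithMany(b => b.BookAuthors)
        .HasForeignKey(ba => ba.BookId)
        .OnDelete(DeleteBehavior.Cascade);

    // Same cleanup on the Author side of the join.
    builder.HasOne(ba => ba.Author)
        .WithMany(a => a.BookAuthors)
        .HasForeignKey(ba => ba.AuthorId)
        .OnDelete(DeleteBehavior.Cascade);
}
```

With cascade configured at the database level, the service layer no longer needs to load and delete the join rows explicitly before deleting a `Book` or `Author`.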
Refhub/Components/Views/Home/LastBookView.cshtml (1)
28-31: Image validation is properly implemented.

The conditional rendering with null/empty check for `ImagePath` correctly prevents broken image tags, which addresses the S3 migration requirements.

Refhub/Migrations/20250611081248_fix-mig.cs (1)
13-26: Add pre-migration validation for data integrity.

Creating a unique index on `Slug` and reducing the column size to `nvarchar(450)` will fail if there are existing duplicate values or slugs longer than 450 characters in the database.

Refhub/Models/Books/UpdateBookVM.cs (1)
31-32: LGTM! URL validation properly implemented.

The `[Url]` attribute correctly validates that ImagePath contains a valid URL, which aligns with the S3-based storage architecture. The localized error message using the Messages resource class is appropriate.
Refhub/Service/Implement/BookService.cs (3)
153-154: Slug existence check properly maintained.

The slug existence check using `AnyAsync` is correctly implemented to prevent duplicate slugs and uses async operations as recommended in previous reviews.

259-260: Query optimization implemented.

The simplified query without unnecessary `.Include(b => b.BookAuthors)` improves performance as recommended in previous reviews, since cascade delete handles the relationship automatically.

276-280: Critical issue resolved: BookRelations handling added.

The implementation now properly handles `BookRelations` before deleting a book, addressing the critical issue from previous reviews where `DeleteBehavior.Restrict` would cause runtime errors.

Refhub/Service/Interface/IFileUploaderService.cs (1)

7-7: Interface correctly defines non-nullable bucketName parameter.

The `bucketName` parameter is properly defined as non-nullable `string`, which creates a stronger contract as suggested in previous reviews.

Refhub/Controllers/BookController.cs (1)
13-13: Fix parameter naming to follow C# conventions.

Primary constructor parameters should use camelCase naming without underscores.

```diff
-public class BookController(IBookService bookService, IFileUploaderService s3FileUploaderService, ILogger<BookController> logger, IMessageService messageService) : Controller
+public class BookController(IBookService bookService, IFileUploaderService fileUploaderService, ILogger<BookController> logger, IMessageService messageService) : Controller
```

Note: You'll also need to update all references to these parameters in the controller methods to use the new names without underscores.
Refhub/Resources/Messages.Designer.cs (1)
1-25: Don't manually edit auto-generated files - configure the generator instead.

This file is auto-generated and manual changes will be lost upon regeneration. To make the resources public, configure the resource generator properly:

- In the `.resx` file properties, change Custom Tool from `ResXFileCodeGenerator` to `PublicResXFileCodeGenerator`
- Or add to your `.csproj`:

```xml
<PropertyGroup>
  <PublicResXFileCodeGeneratorTargets>true</PublicResXFileCodeGeneratorTargets>
</PropertyGroup>
```

Refhub/Service/Implement/S3FileUploaderService.cs (2)
121-121: Use English for code comments.

For consistency and international collaboration, use English for all code comments.

```diff
-// مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
+// For example, if the file doesn't exist or the key is incorrect
```
1-4: Add missing using directives for compilation.

The code references `IFormFile` and `Path` but is missing required imports.

```diff
 using Microsoft.Extensions.Options;
+using System.IO;
+using Microsoft.AspNetCore.Http;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;
```
🧹 Nitpick comments (9)
Refhub/Program.cs (1)
24-30: Move `BindS3Model()` after configuration sources to avoid stale binding

`builder.BindS3Model()` executes before the extra `AddJsonFile(...)` providers are registered. If those JSON files (or env-vars) override the `"AWS:S3"` section, the bound `S3Configuration` will not pick them up.

```diff
-#region CustomExtentionMethod
-builder.BindS3Model();
-builder.Services.AddCustomService();
-
-builder.Configuration
-    .SetBasePath(Directory.GetCurrentDirectory())
-    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
-    .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional: true)
-    .AddEnvironmentVariables();
+#region CustomExtentionMethod
+builder.Configuration
+    .SetBasePath(Directory.GetCurrentDirectory())
+    .AddJsonFile("appsettings.json", optional:false, reloadOnChange:true)
+    .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional:true)
+    .AddEnvironmentVariables();
+
+builder.BindS3Model();
+builder.Services.AddCustomService();
```

Re-ordering guarantees the latest values are bound.
Refhub/Views/Book/BookDetails.cshtml (1)
24-26: `Url.Content` not needed for absolute S3 URLs

`Url.Content()` is only useful for virtual paths. If `Model.ImagePath` already contains a full URL, output it directly:

```diff
-<img src="@Url.Content(Model.ImagePath)" … />
+<img src="@Model.ImagePath" … />
```

Refhub/Areas/Admin/Views/ManageBook/Index.cshtml (1)
1-3: Remove unused `Refhub.Tools.ExtensionMethod` import

The view no longer uses any extension methods from that namespace; the `@using` directive can be dropped.

Refhub/Models/Enums/BucketNames.cs (1)
7-7: Fix formatting inconsistency.

Remove the extra whitespace before `BookPdf` to maintain consistent indentation (the suggested diff is a whitespace-only change to the `BookPdf,` enum member line).

Refhub/Data/Configuration/BookConfiguration.cs (2)
30-30: Remove duplicate semicolon.

There's a double semicolon at the end of the line that should be removed.

```diff
-    .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Cascade); ;
+    .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Cascade);
```

36-36: Remove duplicate semicolon.

There's a double semicolon at the end of the line that should be removed.

```diff
-    .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Cascade); ;
+    .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Cascade);
```

Refhub/Controllers/BookController.cs (1)
65-65: Add missing space after comma.

```diff
-return StatusCode(500,messageService.Get("DownloadError"));
+return StatusCode(500, messageService.Get("DownloadError"));
```

Refhub/Service/Implement/S3FileUploaderService.cs (2)
37-37: Remove commented out code.

If the `_bucketName` field is not needed, remove this commented line entirely.

```diff
-//_bucketName = s3Options.Value.BucketName;
```
32-32: Replace Persian comments with English.

For consistency and better collaboration, use English for all code comments.

```diff
-ForcePathStyle = true // بسیار مهم برای کار با آروانکلاد
+ForcePathStyle = true // Very important for working with ArvanCloud

-// برای آروان کلاد:
+// For ArvanCloud:

-// برای آروان کلاد:
+// For ArvanCloud:
```

Also applies to: 41-41, 48-48
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (33)
- `Refhub/Areas/Admin/Views/ManageBook/Index.cshtml` (1 hunks)
- `Refhub/Areas/Admin/Views/ManageBook/Update.cshtml` (1 hunks)
- `Refhub/Components/Views/Home/BestBookView.cshtml` (1 hunks)
- `Refhub/Components/Views/Home/LastBookView.cshtml` (1 hunks)
- `Refhub/Controllers/BookController.cs` (3 hunks)
- `Refhub/Data/Configuration/BookAuthorConfiguration.cs` (1 hunks)
- `Refhub/Data/Configuration/BookConfiguration.cs` (1 hunks)
- `Refhub/Migrations/20250611081248_fix-mig.Designer.cs` (1 hunks)
- `Refhub/Migrations/20250611081248_fix-mig.cs` (1 hunks)
- `Refhub/Migrations/20250612083534_AddUniqueIndexToBookSlugAndModifyType.cs` (1 hunks)
- `Refhub/Models/AppSetting/S3Configuration.cs` (1 hunks)
- `Refhub/Models/Books/UpdateBookVM.cs` (2 hunks)
- `Refhub/Models/Enums/BucketNames.cs` (1 hunks)
- `Refhub/Models/Enums/DirectoryTypes.cs` (1 hunks)
- `Refhub/Program.cs` (2 hunks)
- `Refhub/Refhub.csproj` (1 hunks)
- `Refhub/Resources/Messages.Designer.cs` (34 hunks)
- `Refhub/Resources/Messages.fa.resx` (2 hunks)
- `Refhub/Resources/Messages.resx` (1 hunks)
- `Refhub/Service/Implement/BookService.cs` (5 hunks)
- `Refhub/Service/Implement/LocalFileUploaderService.cs` (0 hunks)
- `Refhub/Service/Implement/S3FileUploaderService.cs` (1 hunks)
- `Refhub/Service/Interface/IFileUploaderService.cs` (1 hunks)
- `Refhub/Tools/Exceptions/EntityNotFoundException.cs` (0 hunks)
- `Refhub/Tools/Exceptions/FileDownloadException.cs` (1 hunks)
- `Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs` (2 hunks)
- `Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs` (1 hunks)
- `Refhub/Tools/ExtentionMethod/PathExtionMethod.cs` (0 hunks)
- `Refhub/Tools/Static/BucketNameStatic.cs` (1 hunks)
- `Refhub/Tools/Static/FolderNameStatic.cs` (1 hunks)
- `Refhub/Views/Book/BookDetails.cshtml` (2 hunks)
- `Refhub/Views/Book/List.cshtml` (1 hunks)
- `Refhub/appsettings.Development.json` (1 hunks)
💤 Files with no reviewable changes (3)
- Refhub/Tools/ExtentionMethod/PathExtionMethod.cs
- Refhub/Tools/Exceptions/EntityNotFoundException.cs
- Refhub/Service/Implement/LocalFileUploaderService.cs
🧰 Additional context used
🧬 Code Graph Analysis (3)
Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)
Refhub/Data/Models/Author.cs (1)
Author(3-12)
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
S3Configuration(8-15)
Refhub/Migrations/20250611081248_fix-mig.Designer.cs (2)
Refhub/Migrations/AppDbContextModelSnapshot.cs (1)
DbContext(13-541)Refhub/Migrations/20250611081248_fix-mig.cs (1)
fixmig(8-43)
🔇 Additional comments (32)
Refhub/Components/Views/Home/BestBookView.cshtml (2)
24-27: LGTM! Proper null validation for image rendering.

The conditional check `@if (!string.IsNullOrEmpty(item.ImagePath))` correctly prevents broken image links when `ImagePath` is null or empty. This addresses the concern from the previous review about removing the `ConvertForBookPathImage()` extension method.

24-27: LGTM! Proper null check prevents broken images.

The conditional rendering correctly prevents broken images when `ImagePath` is null or empty, which addresses the removal of the extension method.

Refhub/Views/Book/List.cshtml (1)
103-107: Null-check guard looks good

The conditional rendering prevents broken images after the S3 migration.
No further issues spotted.Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)
17-20: Verify service prerequisites forS3FileUploaderServiceRegistration switch is correct.
Please confirm that:
BindS3Model()is executed (see Program.cs), and- All constructor dependencies of
S3FileUploaderService(e.g.,IOptions<S3Configuration>,IAmazonS3) are registered.Otherwise the app will fail at startup with
InvalidOperationException.Refhub/Areas/Admin/Views/ManageBook/Index.cshtml (1)
51-56: Image null-check added — good catchPrevents broken thumbnails in the admin grid.
Refhub/Models/Enums/DirectoryTypes.cs (1)
3-10: Well-designed enum implementation.The enum follows best practices with explicit zero value, descriptive names, and consistent formatting. Good foundation for strongly-typed directory management.
Refhub/Data/Configuration/BookConfiguration.cs (1)
29-30: Cascade delete behavior is appropriate.The cascade delete configuration for
RelatedToandRelatedFromrelationships ensures referential integrity when books are deleted, which is the correct behavior for these dependent relationships.Also applies to: 35-36
Refhub/Tools/Exceptions/FileDownloadException.cs (1)
1-13: LGTM! Well-structured custom exception class.The implementation follows standard .NET exception patterns with proper inheritance and constructor overloads for both message-only and message+inner exception scenarios. This will provide clear error handling for S3 file download operations.
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)
9-14: LGTM! Proper configuration binding implementation.The extension method correctly implements the configuration binding pattern using
Configure<T>and maintains the fluent interface by returning the builder. The "AWS:S3" configuration section path is appropriate for S3 settings.Refhub/Migrations/20250612083534_AddUniqueIndexToBookSlugAndModifyType.cs (1)
21-25: LGTM! Safe index dropping implementation.The raw SQL implementation correctly addresses the previous concern about migration failures by using a conditional check against
sys.indexesbefore dropping the existing index. This ensures the migration won't fail if the index doesn't exist.Refhub/Refhub.csproj (4)
7-7: LGTM! UserSecretsId enables secure configuration management.The addition of UserSecretsId is appropriate for managing sensitive AWS S3 credentials securely during development.
16-16: LGTM! AWS SDK package version is up-to-date.The AWSSDK.S3 package version 4.0.4.2 is the latest version and addresses performance improvements mentioned in previous reviews.
28-30: LGTM! Proper configuration for public resource generation.The PublicResXFileCodeGeneratorTargets property correctly enables public access to generated resource classes, which is necessary for validation attributes in the model classes.
33-33: LGTM! Resource generator change enables public access.Changing from ResXFileCodeGenerator to PublicResXFileCodeGenerator is appropriate to make the Messages class public, enabling its use in validation attributes across the application.
Refhub/Models/AppSetting/S3Configuration.cs (1)
1-16: Excellent implementation following previous feedback.The S3Configuration class properly implements all the suggestions from previous reviews:
- Uses
sealedmodifier for immutability- Implements
initaccessors instead ofset- Includes
[Required]attributes for validation- Provides clear documentation about sensitive data handling
- Makes
ServiceURLnullable as appropriateRefhub/Tools/Static/FolderNameStatic.cs (1)
7-17: Good refactoring with proper error handling.

The refactored implementation properly addresses the previous feedback by:

- Using an enum-based approach for type safety
- Throwing `ArgumentOutOfRangeException` for unhandled enum values instead of a default case
- Using `nameof` for consistent string mapping
- Providing clear error messages
This eliminates the risk of unexpected behavior from unhandled enum values.
Refhub/Resources/Messages.resx (1)
225-236: Well-structured resource entries for file handling.

The new resource entries are appropriate for the S3 file handling functionality:

- `FileNotFound`, `InvalidFileName`, `DownloadError` support error handling
- `Book_ImagePathMustBeUrl` enforces URL validation requirements
- Messages are clear and user-friendly
Refhub/Resources/Messages.fa.resx (2)
210-212: Translation corrected properly.

The Persian translation for `Keyword_AlreadyExists` has been corrected to "کلید واژه از قبل وجود دارد" ("the keyword already exists"), which properly conveys "already exists" rather than just "found."

225-236: Appropriate Persian translations for file handling.

The Persian translations for the new file handling messages are accurate and user-friendly:

- `FileNotFound`: "فایل مورد نظر پیدا نشد." ("The requested file was not found.")
- `InvalidFileName`: "نام فایل وارد شده نامعتبر است" ("The entered file name is invalid")
- `DownloadError`: "خطا در دانلود فایل رخ داد" ("An error occurred while downloading the file")
- `Book_ImagePathMustBeUrl`: "مسیر تصویر کتاب باید یک URL معتبر باشد" ("The book image path must be a valid URL")

Refhub/Service/Implement/BookService.cs (4)
8-8: Good addition of enum namespace.

Adding the `Refhub.Models.Enums` namespace supports the new bucket naming system with type safety.

160-161: Excellent migration to bucket-based file storage.

The implementation properly uses the new enum-based bucket naming system:

- `BucketNameStatic.GetName(BucketNames.BookPdf)` for PDF files
- `BucketNameStatic.GetName(BucketNames.BookImages)` for images
- Method name corrected from `UpdloadFile` to `UploadFile`

This provides type safety and consistency across the application.

215-229: Consistent bucket naming in update operations.

The file update logic properly uses the new bucket naming system for both deletion and upload operations, ensuring consistency with the S3 migration.

268-274: Proper file deletion with bucket names.

File deletion operations correctly use the new bucket naming system, ensuring consistency with the S3 storage migration.
Refhub/Service/Interface/IFileUploaderService.cs (1)
5-5: Good parameter naming improvement!

The use of `bucketName` instead of `directoryName` makes the interface clearer and more aligned with S3 terminology.

Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1)
1-542: Auto-generated migration file looks correct.

This EF Core migration designer file is auto-generated and properly defines the database schema including Identity entities and domain models. No manual modifications needed.
Refhub/Controllers/BookController.cs (1)
33-67: Well-implemented download endpoint with proper error handling.

The implementation addresses previous review comments by:
- Using query parameter for fileUrl instead of route parameter
- Proper URL validation
- Dynamic content type determination
- Specific exception handling with localized messages
- Proper logging of errors
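Taken together, the bullets above imply a controller action shaped roughly like the following. This is a hedged sketch reconstructed from the review comments — the service/parameter names and the content-type handling are assumptions, not the actual `BookController` code:

```csharp
[HttpGet]
[Authorize]
public async Task<IActionResult> DownloadFile(string fileUrl, CancellationToken ct)
{
    // URL validation per the review: reject missing or non-absolute URLs.
    if (string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _))
        return NotFound(messageService.Get("InvalidFileName"));

    try
    {
        var stream = await s3FileUploaderService.DownloadFileAsync(
            fileUrl, ct, BucketNameStatic.GetName(BucketNames.BookPdf));

        // Content type would be determined dynamically in the real code;
        // octet-stream is a safe placeholder here.
        var fileName = Path.GetFileName(new Uri(fileUrl).LocalPath);
        return File(stream, "application/octet-stream", fileName);
    }
    catch (FileDownloadException ex)
    {
        logger.LogError(ex, "File download failed for {FileUrl}", fileUrl);
        return StatusCode(500, messageService.Get("DownloadError"));
    }
}
```

Catching the specific `FileDownloadException` (rather than `Exception`) keeps unexpected failures visible while mapping known S3 download failures to a localized 500 response.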
Refhub/Service/Implement/S3FileUploaderService.cs (3)
46-60: Good implementation of URL to key extraction.

The method properly validates that the URL starts with the expected prefix before extracting the key, which is more robust than using simple string replacement.

62-80: Well-implemented upload method with unique file naming.

The implementation correctly:

- Generates unique file names using GUID to prevent overwrites
- Uses proper stream disposal with a `using` statement
- Sets appropriate public read permissions

98-124: Correct implementation avoiding stream disposal issues.

The method properly copies the S3 response stream to a new `MemoryStream` before returning, which prevents the disposal issue mentioned in previous reviews. Good error handling with specific exception type.
Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (1)
50-53: LGTM! Conditional rendering prevents broken image previews.

The null check correctly prevents broken images in the admin interface when updating books without image paths.
Refhub/appsettings.Development.json (1)
1-9: LGTM! Clean configuration focused on logging.

The simplified configuration appropriately removes connection strings and focuses on logging settings for development.
Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)
13-21: LGTM! Proper cascade delete behavior implemented.

The cascade delete configuration correctly ensures that `BookAuthor` entries are automatically cleaned up when their parent `Book` or `Author` entities are deleted, maintaining referential integrity.
|
/gemini review |
There was a problem hiding this comment.
Code Review
This pull request introduces a major change by migrating file storage to S3. The implementation is well-structured, but there's a critical security issue regarding public file access and areas for improvement concerning code correctness and maintainability.
- correct parameter name from 'filePath' to 'fileUrl' in DownloadFile action link - ensure proper file download functionality for related books
- removed duplicate HttpGet attribute for the download endpoint
- set uploaded file ACL to private to prevent unauthorized access
- prevent cascade delete when deleting a book to avoid unintended data loss - restrict delete behavior for both RelatedTo and RelatedFrom relationships
There was a problem hiding this comment.
Actionable comments posted: 0
♻️ Duplicate comments (4)
Refhub/Service/Implement/S3FileUploaderService.cs (4)
1-16: Missing essential using directives for compilation.

The code references `IFormFile`, `Path`, and `MemoryStream` but lacks the required using statements for `Microsoft.AspNetCore.Http` and `System.IO`.

```diff
+using System.IO;
+using Microsoft.AspNetCore.Http;
 using Microsoft.Extensions.Options;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;
```
84-85: Redundant null check for non-nullable parameter.

Since the method signature uses a non-nullable `string bucketName`, this null check is unnecessary and can be removed. The compiler ensures a non-null value is passed.

```diff
-        if (string.IsNullOrWhiteSpace(bucketName))
-            throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));
```
100-101: Redundant null check for non-nullable parameter.

Similar to the DeleteFile method, this null check is unnecessary since `bucketName` is a non-nullable string parameter.

```diff
-        if (string.IsNullOrWhiteSpace(bucketName))
-            throw new ArgumentException("Bucket name cannot be null or empty.", nameof(bucketName));
```
110-118: Consider memory efficiency for large files.

The current implementation loads the entire file into memory, which could cause issues with large files. For better scalability, consider returning the S3 response stream directly or implementing streaming for large files.

However, the current approach may be acceptable depending on your use case and expected file sizes. The `MemoryStream` disposal responsibility is correctly handled by the caller.
MemoryStreamdisposal responsibility is correctly handled by the caller.
🧹 Nitpick comments (1)
Refhub/Service/Implement/S3FileUploaderService.cs (1)
37-37: Remove commented-out dead code.

The commented assignment should be removed since the `_bucketName` field is no longer used in the current implementation.

```diff
-    //_bucketName = s3Options.Value.BucketName;
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- `Refhub/Controllers/BookController.cs` (3 hunks)
- `Refhub/Data/Configuration/BookConfiguration.cs` (1 hunks)
- `Refhub/Service/Implement/S3FileUploaderService.cs` (1 hunks)
- `Refhub/Views/Book/BookDetails.cshtml` (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- Refhub/Views/Book/BookDetails.cshtml
- Refhub/Data/Configuration/BookConfiguration.cs
- Refhub/Controllers/BookController.cs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: 🐳 Docker Build & Test
🔇 Additional comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)
39-60: LGTM! Robust URL handling implementation.

The helper methods correctly handle URL generation and key extraction with improved robustness:

- Uses path-style URL format consistent with `ForcePathStyle = true`
- `GetKey` properly validates URL prefix before extraction
- Uses `StartsWith` + `Substring` instead of fragile `Replace` operations
62-80: LGTM! Secure and robust file upload implementation.

The upload method correctly addresses previous security and robustness concerns:

- Generates unique keys with GUID to prevent overwrites
- Uses `S3CannedACL.Private` for secure access control
- Properly disposes of the file stream with a `using` statement
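In outline, the upload pattern these bullets describe looks something like the following sketch. It uses real AWS SDK for .NET (`AWSSDK.S3`) types, but the method shape and the `GenerateS3Url` helper are assumptions reconstructed from the review, not the repository's exact code:

```csharp
public async Task<string> UploadFile(IFormFile file, string bucketName)
{
    // GUID-based key prevents two uploads with the same original
    // file name from overwriting each other.
    var key = $"{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";

    using var stream = file.OpenReadStream(); // disposed when the method exits

    var request = new PutObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key,
        InputStream = stream,
        CannedACL = S3CannedACL.Private // objects are not publicly readable
    };

    await _s3Client.PutObjectAsync(request);
    return GenerateS3Url(bucketName, key); // assumed helper from the review
}
```

The private ACL is the security fix called out earlier in this thread: downloads then have to go through the authorized controller endpoint instead of a public object URL.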
usingstatement
- remove commented-out line assigning bucket name - this line was unnecessary and did not affect functionality
- remove bucket name validation from DeleteFile and DownloadFileAsync methods - simplify the logic
|
/gemini review |
There was a problem hiding this comment.
Code Review
This pull request introduces S3 integration for file storage, enhancing scalability and reliability. Key improvements include the S3FileUploaderService, S3 configuration management, and improved data integrity with Entity Framework configurations. Code cleanup and bug fixes are also notable. Suggestions are provided to improve code clarity, consistency, and remove potential dead code.
```csharp
using Refhub.Models.Enums;

namespace Refhub.Tools.Static;

public static class FolderNameStatic
{
    public static string GetDirectoryImages = "Images";
    public static string GetDirectoryBooks = "Books";
    public static string GetDirectoryFiles = "Files";
    public static string GetDirectoryName(DirectoryTypes folder)
    {
        return folder switch
        {
            DirectoryTypes.Images => nameof(DirectoryTypes.Images),
            DirectoryTypes.Books => nameof(DirectoryTypes.Books),
            DirectoryTypes.Files => nameof(DirectoryTypes.Files),

            _ => throw new ArgumentOutOfRangeException(nameof(folder), $"Unsupported directory type: {folder}")
        };
    }
```
There was a problem hiding this comment.
This static class FolderNameStatic and the associated DirectoryTypes enum appear to be remnants of the previous local file storage implementation. Since the application has been migrated to use S3 and the LocalFileUploaderService has been removed, this code is likely unused.
If it's confirmed to be dead code, it should be removed to keep the codebase clean and avoid confusion.
```csharp
if (string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _))
{
    return NotFound(messageService.Get("InvalidFileName"));
}
```
```csharp
await context.Authors.AddAsync(author, ct);
await context.SaveChangesAsync(ct);
return true;
```
There was a problem hiding this comment.
It's crucial to handle potential exceptions that might occur during the SaveChangesAsync operation. Wrapping this call in a try-catch block would allow you to log any errors and potentially retry the operation or take other corrective actions.
```csharp
try
{
    await context.SaveChangesAsync(ct);
}
catch (Exception ex)
{
    // Log the exception or handle it appropriately
    Console.WriteLine($"Error saving changes: {ex.Message}");
    throw;
}
```

```csharp
[Url(ErrorMessageResourceType = typeof(Messages),
    ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }
```
- refine conditional statement for image display - ensure consistent code style for better readability
- remove directoryName parameter and use bucketName directly - improve code readability and maintainability
- remove FolderNameStatic class - the class is no longer needed
- add IsValidFileUrl method to check file URL validity - return NotFound if file URL is invalid
- enable client-side validation in admin area - include jquery, jquery-validation, and jquery-validation-unobtrusive libraries
- add a try-catch block around SaveChangesAsync to catch and log exceptions - rethrow the exception to ensure it's handled upstream
- implement client-side validation for manage book form
There was a problem hiding this comment.
Actionable comments posted: 1
♻️ Duplicate comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)
1-4: Missing required using directives for compilation.

The code references `IFormFile`, `Path`, and `MemoryStream` without importing their namespaces, which will cause compilation errors CS0246/CS0103.
80-93: Add input validation and error handling to delete method.

The method lacks input validation and error handling, which could lead to runtime exceptions or silent failures.

```diff
 public async Task DeleteFile(string realUrl, string bucketName)
 {
+    if (string.IsNullOrWhiteSpace(realUrl))
+        throw new ArgumentException("URL cannot be null or empty", nameof(realUrl));
+
+    if (string.IsNullOrWhiteSpace(bucketName))
+        throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));
     var key = GetKey(realUrl, bucketName);
     var request = new DeleteObjectRequest
     {
         BucketName = bucketName.ToLower(),
         Key = key
     };
-    await _s3Client.DeleteObjectAsync(request);
+    try
+    {
+        await _s3Client.DeleteObjectAsync(request);
+    }
+    catch (AmazonS3Exception ex) when (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
+    {
+        // File already doesn't exist, consider this a successful deletion
+    }
 }
```
🧹 Nitpick comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)
8-16: Consider moving using directives to file level for better organization.

While the current nested using directives work, it's more conventional to place all using statements at the file level for better readability and maintainability.

```diff
+using System.IO;
+using Microsoft.AspNetCore.Http;
+using Amazon;
+using Amazon.Runtime;
+using Amazon.S3;
+using Amazon.S3.Model;
+using Refhub.Models.Enums;
+using Refhub.Resources;
+using Refhub.Tools.Exceptions;
+using System.Threading;
 using Microsoft.Extensions.Options;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;

 namespace Refhub.Service.Implement
 {
-    using Amazon;
-    using Amazon.Runtime;
-    using Amazon.S3;
-    using Amazon.S3.Model;
-    using Refhub.Models.Enums;
-    using Refhub.Resources;
-    using Refhub.Tools.Exceptions;
-    using System.Threading;
```
116-116: Remove Persian comment for consistency.

Replace the Persian comment with English for better code maintainability and international collaboration.

```diff
-        // مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
+        // Handle cases where file doesn't exist or key is invalid
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (7)
- `Refhub/Areas/Admin/Views/ManageBook/Update.cshtml` (2 hunks)
- `Refhub/Areas/Admin/Views/Shared/_ValidationScriptsPartial.cshtml` (1 hunks)
- `Refhub/Controllers/BookController.cs` (3 hunks)
- `Refhub/Service/Implement/BookService.cs` (5 hunks)
- `Refhub/Service/Implement/S3FileUploaderService.cs` (1 hunks)
- `Refhub/Tools/Static/FolderNameStatic.cs` (0 hunks)
- `Refhub/Views/Shared/_ValidationScriptsPartial.cshtml` (0 hunks)
💤 Files with no reviewable changes (2)
- Refhub/Views/Shared/_ValidationScriptsPartial.cshtml
- Refhub/Tools/Static/FolderNameStatic.cs
✅ Files skipped from review due to trivial changes (1)
- Refhub/Areas/Admin/Views/Shared/_ValidationScriptsPartial.cshtml
🚧 Files skipped from review as they are similar to previous changes (3)
- Refhub/Areas/Admin/Views/ManageBook/Update.cshtml
- Refhub/Service/Implement/BookService.cs
- Refhub/Controllers/BookController.cs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: 🐳 Docker Build & Test
🔇 Additional comments (4)
Refhub/Service/Implement/S3FileUploaderService.cs (4)
22-37: Constructor implementation looks good with proper configuration.

The constructor correctly initializes the S3 client with dynamic region endpoint configuration and ArvanCloud-specific settings. The use of `ForcePathStyle = true` is appropriate for ArvanCloud compatibility.

38-42: URL generation method is correctly implemented.

The `GenerateS3Url` method now properly constructs path-style URLs using the ServiceURL from configuration, which aligns with the `ForcePathStyle = true` setting and matches the extraction logic in `GetKey`.

45-59: Key extraction method has robust implementation.

The `GetKey` method correctly uses `StartsWith` validation and `Substring` for safe key extraction, which is more robust than the previous `Replace` approach. The English error message improves consistency.

61-78: File upload implementation is secure and well-structured.

The method correctly uses `S3CannedACL.Private` for security, generates unique keys with GUID to prevent overwrites, and properly handles the file stream. The key generation strategy ensures uniqueness while maintaining readability.
```csharp
public async Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName)
{
    var key = GetKey(fileUrl, bucketName);
    var request = new GetObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    try
    {
        using var response = await _s3Client.GetObjectAsync(request, ct);
        var ms = new MemoryStream();
        await response.ResponseStream.CopyToAsync(ms, ct);
        ms.Position = 0;

        return ms;
    }
    catch (AmazonS3Exception ex)
    {
        // e.g., if the file does not exist or the key is wrong
        throw new FileDownloadException("Error downloading file from S3", ex);
    }
}
```
🛠️ Refactor suggestion
Add input validation and consider memory efficiency improvements.
The download method lacks input validation and loads entire files into memory, which could be problematic for large files.
```diff
 public async Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName)
 {
+    if (string.IsNullOrWhiteSpace(fileUrl))
+        throw new ArgumentException("File URL cannot be null or empty", nameof(fileUrl));
+
+    if (string.IsNullOrWhiteSpace(bucketName))
+        throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));
+
     var key = GetKey(fileUrl, bucketName);
     var request = new GetObjectRequest
     {
         BucketName = bucketName.ToLower(),
         Key = key
     };

     try
     {
         using var response = await _s3Client.GetObjectAsync(request, ct);
         var ms = new MemoryStream();
         await response.ResponseStream.CopyToAsync(ms, ct);
         ms.Position = 0;

         return ms;
     }
     catch (AmazonS3Exception ex)
     {
-        // e.g., if the file does not exist or the key is wrong
         throw new FileDownloadException("Error downloading file from S3", ex);
     }
 }
```

For large files, consider returning the S3 response stream directly instead of copying to a MemoryStream, but ensure proper disposal is handled by the caller.
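A minimal sketch of that streaming alternative, under the assumption that the caller disposes the returned stream (`DownloadFileStreamAsync` is a hypothetical name, not part of this PR):

```csharp
// Hypothetical streaming variant: avoids buffering the whole object in a
// MemoryStream by handing the S3 network stream straight to the caller.
public async Task<Stream> DownloadFileStreamAsync(string fileUrl, CancellationToken ct, string bucketName)
{
    var key = GetKey(fileUrl, bucketName);
    var request = new GetObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    // Note: no `using` here on purpose; the response owns the network stream,
    // so the caller must dispose the returned stream to release the connection.
    var response = await _s3Client.GetObjectAsync(request, ct);
    return response.ResponseStream;
}
```

In a controller this pairs naturally with `File(stream, contentType)`, whose `FileStreamResult` disposes the stream after writing the response.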
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```csharp
public async Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName)
{
    if (string.IsNullOrWhiteSpace(fileUrl))
        throw new ArgumentException("File URL cannot be null or empty", nameof(fileUrl));

    if (string.IsNullOrWhiteSpace(bucketName))
        throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));

    var key = GetKey(fileUrl, bucketName);
    var request = new GetObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    try
    {
        using var response = await _s3Client.GetObjectAsync(request, ct);
        var ms = new MemoryStream();
        await response.ResponseStream.CopyToAsync(ms, ct);
        ms.Position = 0;

        return ms;
    }
    catch (AmazonS3Exception ex)
    {
        throw new FileDownloadException("Error downloading file from S3", ex);
    }
}
```
🤖 Prompt for AI Agents
In Refhub/Service/Implement/S3FileUploaderService.cs around lines 95 to 119, add
input validation to check that fileUrl and bucketName are not null or empty
before proceeding. To improve memory efficiency for large files, modify the
method to return the S3 response stream directly instead of copying it into a
MemoryStream, and update the method's documentation to indicate that the caller
is responsible for disposing the returned stream.
/gemini review
Code Review
This pull request migrates the application to use S3 for file storage. The changes include updates to the controller, service, and interface layers to support S3 file uploads and downloads. A critical bug was identified in the URL validation logic, and suggestions were made to improve logging and support cancellation of file operations.
```csharp
private bool IsValidFileUrl(string fileUrl)
{
    return !string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _);
}
```

The condition `!string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _)` uses a logical OR, which means the method returns true if the URL is not whitespace OR if the URL is invalid. This is likely incorrect. It should use a logical AND (`&&`) to ensure the URL is both not whitespace AND a valid URI:

```csharp
return !string.IsNullOrWhiteSpace(fileUrl) && Uri.TryCreate(fileUrl, UriKind.Absolute, out _);
```
```csharp
var filePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug);
var imagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug);
```

The CancellationToken should be passed to the UploadFile method to allow for cancellation of the upload operation. This requires updating the IFileUploaderService interface and its implementations to accept a CancellationToken.

```csharp
var filePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug, ct);
var imagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug, ct);
```

```diff
         await uploaderService.DeleteFile(existingBook.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf));
     }

-    existingBook.FilePath = await uploaderService.UpdloadFile(book.File, FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.Slug);
+    existingBook.FilePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug);
 }

 if (book.Image != null)
 {
     if (!string.IsNullOrWhiteSpace(existingBook.ImagePath))
     {
-        await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, existingBook.ImagePath);
+        await uploaderService.DeleteFile(existingBook.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages));
     }

-    existingBook.ImagePath = await uploaderService.UpdloadFile(book.Image, FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.Slug);
+    existingBook.ImagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug);
```

Pass the CancellationToken to the DeleteFile and UploadFile methods to allow for cancellation of these operations.

```csharp
if (book.File != null)
{
    if (!string.IsNullOrWhiteSpace(existingBook.FilePath))
    {
        await uploaderService.DeleteFile(existingBook.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf), ct);
    }

    existingBook.FilePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug, ct);
}

if (book.Image != null)
{
    if (!string.IsNullOrWhiteSpace(existingBook.ImagePath))
    {
        await uploaderService.DeleteFile(existingBook.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages), ct);
    }

    existingBook.ImagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug, ct);
}
```

```diff
     await uploaderService.DeleteFile(book.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages));
 }

 if (!string.IsNullOrWhiteSpace(book.FilePath))
 {
-    await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.FilePath);
+    await uploaderService.DeleteFile(book.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf));
```

Pass the CancellationToken to the DeleteFile methods to allow for cancellation of these operations.

```csharp
if (!string.IsNullOrWhiteSpace(book.ImagePath))
{
    await uploaderService.DeleteFile(book.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages), ct);
}

if (!string.IsNullOrWhiteSpace(book.FilePath))
{
    await uploaderService.DeleteFile(book.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf), ct);
}
```

```csharp
public async Task<string> UploadFile(IFormFile file, string bucketName, string name)
{
    var key = $"{name.Replace(" ", "-")}_{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";

    using var stream = file.OpenReadStream();
    var request = new PutObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key,
        InputStream = stream,
        ContentType = file.ContentType,
        CannedACL = S3CannedACL.Private
    };

    await _s3Client.PutObjectAsync(request);
    return GenerateS3Url(key, bucketName);
}
```

Add a CancellationToken parameter to the UploadFile method and pass it to the PutObjectAsync method to allow for cancellation of the upload operation.

```csharp
public async Task<string> UploadFile(IFormFile file, string bucketName, string name, CancellationToken ct)
{
    var key = $"{name.Replace(" ", "-")}_{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";

    using var stream = file.OpenReadStream();
    var request = new PutObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key,
        InputStream = stream,
        ContentType = file.ContentType,
        CannedACL = S3CannedACL.Private
    };

    await _s3Client.PutObjectAsync(request, ct);
    return GenerateS3Url(key, bucketName);
}
```

```csharp
public async Task DeleteFile(string realUrl, string bucketName)
{
    var key = GetKey(realUrl, bucketName);

    var request = new DeleteObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    await _s3Client.DeleteObjectAsync(request);
}
```

Add a CancellationToken parameter to the DeleteFile method and pass it to the DeleteObjectAsync method to allow for cancellation of the delete operation.

```csharp
public async Task DeleteFile(string realUrl, string bucketName, CancellationToken ct)
{
    var key = GetKey(realUrl, bucketName);

    var request = new DeleteObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    await _s3Client.DeleteObjectAsync(request, ct);
}
```

```csharp
Task<string> UploadFile(IFormFile file, string bucketName, string name);
Task DeleteFile(string realUrl, string bucketName);
Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName);
```

Add a CancellationToken parameter to the UploadFile and DeleteFile methods to allow for cancellation of these operations.

```csharp
Task<string> UploadFile(IFormFile file, string bucketName, string name, CancellationToken ct);
Task DeleteFile(string realUrl, string bucketName, CancellationToken ct);
Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName);
```

- fix the condition in IsValidFileUrl to properly validate file URLs; ensure that the file URL is not null or whitespace and is a valid absolute URI
- add validation for fileUrl and bucketName to prevent empty values; throw ArgumentException if fileUrl or bucketName is null or empty
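For context on why the `ct` parameter is worth threading through: ASP.NET Core binds a `CancellationToken` action parameter to `HttpContext.RequestAborted`, so a client disconnect can cancel an in-flight S3 transfer. A rough sketch (the action shape is illustrative, not the actual `BookController` code):

```csharp
[Authorize]
[HttpGet("download/{fileName}")]
public async Task<IActionResult> DownloadFile(string fileName, CancellationToken ct)
{
    // ct is wired to HttpContext.RequestAborted by model binding; passing it
    // down lets the S3 download stop as soon as the client goes away.
    var stream = await uploaderService.DownloadFileAsync(fileName, ct,
        BucketNameStatic.GetName(BucketNames.BookPdf));
    return File(stream, "application/pdf", fileName);
}
```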
/gemini review
Code Review
This pull request introduces S3 support for file storage, enhancing scalability and robustness. The changes are comprehensive, covering data access, services, and views. Review comments focus on dependency injection, logging, data integrity, and localized error messages to improve maintainability and user experience.
```csharp
catch (Exception ex)
{
    // Log the exception or handle it appropriately
    Console.WriteLine($"Error saving changes: {ex.Message}");
    throw;
```

Using Console.WriteLine for logging exceptions is not suitable for production environments as it can be easily missed and doesn't provide structured logging capabilities. Inject `ILogger<BookService>` and use it for logging. For example:

```csharp
// Log the exception or handle it appropriately
_logger.LogError(ex, "Error saving changes: {Message}", ex.Message);
throw;
```

```csharp
public S3FileUploaderService(IOptions<S3Configuration> s3Options)
{
    this._s3Options = s3Options;

    var credentials = new BasicAWSCredentials(s3Options.Value.AccessKey, s3Options.Value.SecretKey);

    var config = new AmazonS3Config
    {
        RegionEndpoint = RegionEndpoint.GetBySystemName(s3Options.Value.Region), // Dynamically set region endpoint
        ServiceURL = s3Options.Value.ServiceURL,
        ForcePathStyle = true // Essential for working with ArvanCloud
    };

    _s3Client = new AmazonS3Client(credentials, config);
}
```

Instead of manually creating the IAmazonS3 client, register it with the dependency injection (DI) container and inject the IAmazonS3 interface directly. The AWS SDK for .NET provides helper methods for this, like `services.AddAWSService<IAmazonS3>()`.

Register it in Program.cs:

```csharp
builder.Services.AddDefaultAWSOptions(builder.Configuration.GetAWSOptions());
builder.Services.AddAWSService<IAmazonS3>();
```

And then your service's constructor would be simplified to:

```csharp
public S3FileUploaderService(IAmazonS3 s3Client, IOptions<S3Configuration> s3Options)
{
    _s3Client = s3Client;
    _s3Options = s3Options;
}
```

```csharp
[Url(ErrorMessageResourceType = typeof(Messages),
    ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }
```

Consider validating ImagePath as a URL to ensure data integrity. Using the [Url] data annotation enforces that the ImagePath is a valid URL, which is important when using S3.

```csharp
[Url(ErrorMessageResourceType = typeof(Messages),
    ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }
```

```csharp
if (!IsValidFileUrl(fileUrl))
{
    return NotFound(messageService.Get("InvalidFileName"));
}
```

```diff
 builder.HasMany(b => b.RelatedTo).WithOne(rt => rt.Book)
-    .HasForeignKey(bt => bt.BookId);
+    .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Restrict);

 builder.HasMany(b => b.RelatedFrom).WithOne(rf => rf.RelatedBook)
-    .HasForeignKey(rf => rf.RelatedBookId);
+    .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Restrict);
```

Enforce restricted delete behavior on related book relationships to prevent unintended deletions. This ensures data integrity and prevents accidental data loss.

```csharp
builder.HasMany(b => b.RelatedTo).WithOne(rt => rt.Book)
    .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Restrict);

builder.HasMany(b => b.RelatedFrom).WithOne(rf => rf.RelatedBook)
    .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Restrict);
```

```csharp
var relations = await context.BookRelations
    .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
    .ToListAsync(ct);
if (relations.Any())
    context.BookRelations.RemoveRange(relations);
```

Consider removing related BookRelation entities before deleting a book to maintain data integrity. This prevents orphaned records and ensures consistency in the database.

```csharp
var relations = await context.BookRelations
    .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
    .ToListAsync(ct);
if (relations.Any())
    context.BookRelations.RemoveRange(relations);
```
Summary by CodeRabbit
New Features
Bug Fixes
Refactor
Documentation
Chores