
Add s3 #96

Merged
hootanht merged 109 commits into develop from Add-S3 on Jul 22, 2025

Conversation

@MehdiKh-hub (Contributor) commented Jun 12, 2025

Summary by CodeRabbit

  • New Features

    • Added support for file storage and retrieval using S3-compatible services.
    • Introduced new localized error messages for file download issues and validation.
    • Added enum-based configuration for storage buckets and directories.
  • Bug Fixes

    • Improved file upload, deletion, and download logic for reliability and consistency.
    • Enhanced validation for book image URLs.
  • Refactor

    • Streamlined file handling methods and removed obsolete path transformation logic.
    • Updated service registration to use S3 storage by default.
    • Modified book image rendering to use direct URLs with conditional display.
    • Added cascade delete behavior to related entity relationships for data integrity.
    • Enforced restricted delete behavior on related book relationships to prevent unintended deletions.
  • Documentation

    • Expanded and clarified localized resource strings for error handling.
  • Chores

    • Updated project configuration to support AWS S3 SDK and user secrets.
    • Added client-side validation scripts to admin views.

abolfazlshs80 and others added 20 commits June 11, 2025 21:01
Updated the `<Project Sdk="Microsoft.NET.Sdk.Web">` line to include a leading space for consistency. No functional changes were made to the project structure.
- Added IFileUploaderService dependency to BookController.
- Updated DownloadFile method to use file name in route.
- Implemented error handling for file download failures.
- Added [Authorize] attribute to DownloadFile method.
- Removed unused local file path handling code.
Updated the `BookService` class to remove related `BookAuthors` when a `Book` is deleted. This change ensures referential integrity by preventing orphaned `BookAuthors` in the database.
This commit introduces a new asynchronous method `DownloadFileAsync` in the `S3FileUploaderService` class. The method allows for downloading files from an S3 bucket using a specified `fileName`. It handles the creation of a `GetObjectRequest`, retrieves the file, and returns it as a `MemoryStream`. Additionally, it includes error handling for potential `AmazonS3Exception` cases, providing a clear error message if the file is not found or the key is incorrect.
Introduced a new method `DownloadFileAsync` in the
`IFileUploaderService` interface for asynchronous file
download functionality, accepting a `string fileName`
parameter and returning a `Task<Stream>`.
- Added `<UserSecretsId>` to `Refhub.csproj` for user secrets management.
- Updated `appsettings.Development.json` to remove hardcoded AWS `AccessKey` and `SecretKey`, replacing them with empty strings for improved security.
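Based on the commit descriptions above, the download path might be sketched as follows. This is illustrative only: the constructor shape, field names, and the `FileDownloadException` constructor signature are assumptions, not the PR's exact code.

```csharp
using System.Net;
using Amazon.S3;
using Amazon.S3.Model;

public class S3FileUploaderService
{
    private readonly IAmazonS3 _s3Client;
    private readonly string _bucketName;

    public S3FileUploaderService(IAmazonS3 s3Client, string bucketName)
    {
        _s3Client = s3Client;
        _bucketName = bucketName;
    }

    // Fetches the object and buffers it into a MemoryStream so the S3
    // response can be disposed before the stream is handed to the caller.
    public async Task<Stream> DownloadFileAsync(string fileName)
    {
        var request = new GetObjectRequest
        {
            BucketName = _bucketName,
            Key = fileName
        };

        try
        {
            using var response = await _s3Client.GetObjectAsync(request);
            var buffer = new MemoryStream();
            await response.ResponseStream.CopyToAsync(buffer);
            buffer.Position = 0; // rewind before returning to the caller
            return buffer;
        }
        catch (AmazonS3Exception ex) when (ex.StatusCode == HttpStatusCode.NotFound)
        {
            // Surface a clear error when the file is missing or the key is wrong.
            throw new FileDownloadException($"File '{fileName}' was not found or the key is incorrect.", ex);
        }
    }
}
```

Buffering the whole object into memory is fine for small files; for large books, streaming the response directly into the HTTP response would avoid the extra copy.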
Copilot AI review requested due to automatic review settings June 12, 2025 15:36
labels-and-badges Bot added labels on Jun 12, 2025: NO JIRA (this PR does not have a Jira ticket) and PR:size/XL (denotes a pull request that changes 500-999 lines).

coderabbitai Bot commented Jun 12, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

This update migrates file management from local storage to S3-compatible cloud storage (ArvanCloud), introduces new configuration and error handling for file operations, and revises the BookController and related services accordingly. It also enforces URL validation for book images, adds new enums and static helpers for bucket/directory naming, and updates database schema migrations and resource files.

Changes

File(s) / Group Change Summary
Controllers/BookController.cs Refactored to use S3 file service for downloads, improved error handling, and updated method signatures and dependencies.
Service/Implement/S3FileUploaderService.cs, Models/AppSetting/S3Configuration.cs Added new S3 file uploader service and configuration model for S3/ArvanCloud integration.
Service/Implement/LocalFileUploaderService.cs, Service/Interface/IFileUploaderService.cs Removed local file uploader service, unified method signatures, added async download, and updated interface accordingly.
Tools/ExtentionMethod/AddServiceExtentionMethod.cs, Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs Changed DI registration to use S3 service, added extension for S3 config binding.
Tools/Static/FolderNameStatic.cs, Models/Enums/DirectoryTypes.cs Removed static directory fields and obsolete path extension methods; added enum-based directory types.
Tools/Static/BucketNameStatic.cs, Models/Enums/BucketNames.cs Added enum and static helper for bucket names.
Tools/Exceptions/FileDownloadException.cs Introduced custom exception for file download errors.
Service/Implement/BookService.cs Updated to use new bucket naming, improved file operations, and added duplicate slug check.
Data/Configuration/BookAuthorConfiguration.cs Enabled cascade delete for BookAuthor relationships.
Data/Configuration/BookConfiguration.cs Enabled restrict delete for BookRelation foreign keys.
Migrations/20250611081248_fix-mig.* , Migrations/20250612083534_AddUniqueIndexToBookSlugAndModifyType.cs Added/modified migrations for unique slug index and schema changes.
Program.cs, Refhub.csproj Integrated S3 config binding, enabled UserSecrets, added AWS S3 SDK, and changed resource generator.
Resources/Messages.resx, Resources/Messages.fa.resx, Resources/Messages.Designer.cs Added new error/resource strings, made resources public, and updated Farsi translations.
Models/Books/UpdateBookVM.cs Added URL validation for ImagePath with localized error.
Views/Book/BookDetails.cshtml, Views/Book/List.cshtml, Areas/Admin/Views/ManageBook/*.cshtml, Components/Views/Home/*.cshtml Removed image/file path extension methods, now use raw paths with null checks.
Tools/ExtentionMethod/PathExtionMethod.cs Removed obsolete path extension methods.
appsettings.Development.json Reformatted, removed connection strings.
Tools/Exceptions/EntityNotFoundException.cs Removed trailing blank line.
Areas/Admin/Views/Shared/_ValidationScriptsPartial.cshtml Added partial view for client-side validation scripts.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant BookController
    participant S3FileUploaderService
    participant S3 (ArvanCloud)
    participant Logger
    participant MessageService

    User->>BookController: DownloadFile(fileUrl)
    BookController->>BookController: Validate fileUrl
    alt Invalid fileUrl
        BookController->>User: BadRequest (InvalidFileName)
    else Valid fileUrl
        BookController->>S3FileUploaderService: DownloadFileAsync(fileUrl, ...)
        alt S3 returns file stream
            BookController->>User: File (with content type)
        else FileDownloadException
            BookController->>Logger: Log error
            BookController->>User: NotFound (FileNotFound)
        else Other Exception
            BookController->>Logger: Log error
            BookController->>User: InternalServerError (DownloadError)
        end
    end

Estimated code review effort

3 (~45 minutes)

Possibly related PRs

  • Add s3 #96: Contains identical changes to BookController constructor and DownloadFile method, including S3 integration and error handling.

Suggested reviewers

  • hootanht

Poem

A hop, a leap, to clouds we go,
Book files now in S3's flow!
With buckets, enums, and paths anew,
Error messages clear and true.
No more local file distress—
This bunny brings you cloud success!
🐇☁️📚


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e7a409b and 5edf84c.

📒 Files selected for processing (1)
  • Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)

This comment was marked as outdated.

gemini-code-assist Bot left a comment

Summary of Changes

Hello @MehdiKh-hub, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request implements S3 cloud storage integration, specifically targeting ArvanCloud, to handle file uploads, downloads, and deletions for books and associated files. It replaces the existing local file storage logic by introducing a new S3 service implementation and updating the relevant application code and interfaces. Additionally, an unrelated database migration is included to optimize the Books table slug column.

Highlights

  • S3 Integration: Introduced integration with S3 (specifically configured for ArvanCloud) for file storage, replacing the previous local file storage mechanism.
  • File Uploader Service Update: The IFileUploaderService interface and its implementations (LocalFileUploaderService and the new S3FileUploaderService) were updated to support S3 operations, including a new DownloadFileAsync method and a simplified DeleteFile signature.
  • Dependency Injection Switch: The application's dependency injection configuration was updated to use the new S3FileUploaderService as the concrete implementation for IFileUploaderService.
  • Book File Handling Logic: The BookService and BookController were updated to utilize the new S3-compatible file uploader service methods and handle file paths/URLs appropriate for S3.
  • Database Migration: A new database migration was added to change the Slug column in the Books table to nvarchar(450) and add a unique index on it.
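For orientation, the S3Configuration settings model bound from configuration might look roughly like this. The property names are inferred from the commit messages and review comments in this thread (AccessKey, SecretKey, ServiceURL, Region) and are not confirmed against the actual PR code.

```csharp
// Hypothetical sketch of Models/AppSetting/S3Configuration.cs.
public class S3Configuration
{
    public string AccessKey { get; set; } = string.Empty;
    public string SecretKey { get; set; } = string.Empty;
    public string ServiceURL { get; set; } = string.Empty; // S3-compatible endpoint, e.g. ArvanCloud
    public string BucketName { get; set; } = string.Empty;
    public string Region { get; set; } = string.Empty;
}
```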

gemini-code-assist Bot left a comment

Code Review

This pull request introduces S3 integration for file storage. Key changes include a new S3FileUploaderService, updates to controllers and services to use S3, and related configuration and model adjustments. Several critical and high-priority issues need addressing, including the S3 URL generation logic, error handling in the BookController, ensuring data integrity during book deletion, and completing the implementation of LocalFileUploaderService.

Comment thread: Refhub/Service/Implement/BookService.cs (outdated)

-await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.FilePath);
+await uploaderService.DeleteFile(book.FilePath);
 }
 context.BookAuthors.RemoveRange(book.BookAuthors);

critical

The line context.BookAuthors.RemoveRange(book.BookAuthors); is likely redundant. Also, the BookRelation entity is configured with OnDelete(DeleteBehavior.Restrict), which is not handled in DeleteBookAsync and will lead to runtime errors. Either prevent deletion of books with active relations, or explicitly remove/update these relations before deleting the book.

// Consider removing this line if cascade delete for BookAuthors is correctly configured and working.
// context.BookAuthors.RemoveRange(book.BookAuthors);

// CRITICAL: Handle BookRelations before deleting the book due to DeleteBehavior.Restrict
// Example: Load and remove related BookRelations
// var relations = await context.BookRelations
//     .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
//     .ToListAsync(ct);
// if (relations.Any()) context.BookRelations.RemoveRange(relations);
// Ensure BookKeywords are also handled if not covered by cascade delete (they appear to be).

private string GenerateS3Url(string key)
{
    // For ArvanCloud:
    return $"https://{_bucketName}.{_region}/{key}";
}

critical

The S3 URL generated here (https://{_bucketName}.{_region}/{key}) follows a virtual-hosted style format, but the S3 client is configured with ForcePathStyle = true. Verify ArvanCloud's S3 API documentation regarding ForcePathStyle and the correct format for public object URLs to ensure consistency and correctness.
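For comparison, a path-style URL builder consistent with `ForcePathStyle = true` would place the bucket in the path rather than the host. This is a sketch under the assumption that `endpointHost` holds the bare S3-compatible endpoint; verify the exact format against ArvanCloud's documentation.

```csharp
// Path-style:           https://<endpoint>/<bucket>/<key>
// Virtual-hosted style: https://<bucket>.<endpoint>/<key>  (what the current code emits)
private static string GeneratePathStyleUrl(string endpointHost, string bucketName, string key)
{
    return $"https://{endpointHost}/{bucketName}/{key}";
}
```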

Comment thread: Refhub/Controllers/BookController.cs (outdated), lines +39 to +42

public Task DeleteFile(string realUrl)
{
    throw new NotImplementedException();
}

high

The DeleteFile(string realUrl) method is implemented by throwing a NotImplementedException. If LocalFileUploaderService is intended to be a functional alternative, this method must be properly implemented.

Comment thread: Refhub/Controllers/BookController.cs (outdated)

    return NotFound();
}
var stream = await _s3FileUploaderService.DownloadFileAsync(fileName);
var contentType = "application/octet-stream"; // or the real content type based on the file extension

medium

The contentType is hardcoded. Consider determining the content type dynamically based on the file extension for a better user experience.

// Example: inject FileExtensionContentTypeProvider or create an instance
// private readonly FileExtensionContentTypeProvider _contentTypeProvider = new FileExtensionContentTypeProvider();
if (!new Microsoft.AspNetCore.StaticFiles.FileExtensionContentTypeProvider().TryGetContentType(fileName, out var contentType))
{
    contentType = "application/octet-stream"; // Fallback
}

Comment thread: Refhub/Tools/Static/FolderNameStatic.cs (outdated)

DirectoryTypes.Books => nameof(DirectoryTypes.Books),
DirectoryTypes.Files => nameof(DirectoryTypes.Files),

_ => "Order"

medium

The default case _ => "Order" in the switch expression for GetDirectoryName could lead to unexpected behavior. Consider throwing an ArgumentOutOfRangeException for unhandled enum values to make such issues explicit.

            _ => throw new System.ArgumentOutOfRangeException(nameof(folder), $"Unsupported directory type: {folder}")

coderabbitai Bot left a comment

Actionable comments posted: 8

🔭 Outside diff range comments (4)
Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1)

5-15: 🛠️ Refactor suggestion

Identity methods & naming inconsistencies undermine the utility of this helper.

• Both ConvertForBookPathImage and ConvertForBookPathFile are now pure identity functions; they add no value and invite confusion.
• Class/extension suffix is misspelled (ExtionMethodExtensionMethod).
Consider deleting the extensions entirely and inlining the call-sites, or restore the previous path-building logic behind the new DirectoryTypes abstraction.

-public static string ConvertForBookPathImage(this string imageName)
-{
-    return imageName;
-}
-public static string ConvertForBookPathFile(this string fileName)
-{
-    return fileName;
-}
+// Remove these two methods and update call-sites accordingly,
+// or re-implement them to return $"{DirectoryTypes.BookImages}/{imageName}" etc.
Refhub/Service/Implement/BookService.cs (2)

208-216: ⚠️ Potential issue

Delete happens before successful re-upload – risk of permanent data loss

In the update path you

await uploaderService.DeleteFile(existingBook.FilePath);
// …
existingBook.FilePath = await uploaderService.UploadFile(...)

If the new upload fails, the original file is already gone.
Swap the order (upload → verify → delete old) or perform a copy-on-write approach.

-await uploaderService.DeleteFile(existingBook.FilePath);
-existingBook.FilePath = await uploaderService.UploadFile(...);
+var newPath = await uploaderService.UploadFile(...);
+if (!string.IsNullOrWhiteSpace(newPath))
+{
+    await uploaderService.DeleteFile(existingBook.FilePath);
+    existingBook.FilePath = newPath;
+}

256-272: ⚠️ Potential issue

BookAuthors not eagerly loaded – possible NullReferenceException and missing cascade delete

book is fetched without .Include(b => b.BookAuthors), yet RemoveRange(book.BookAuthors) is invoked.
When lazy-loading is disabled (the default in EF Core) this will throw; when enabled it will incur N+1.
Either:

var book = await context.Books
        .Include(b => b.BookAuthors)
        .FirstOrDefaultAsync(b => b.Id == bookId, ct);

or rely on cascade delete and drop the explicit RemoveRange.

Refhub/Service/Implement/LocalFileUploaderService.cs (1)

7-28: 🛠️ Refactor suggestion

Method marked async but performs only sync IO

UploadFile:

  1. Uses synchronous FileStream & CopyTo, blocking the thread.
  2. Never awaits – the async modifier only adds overhead.

Either switch to truly async APIs or drop async/await.

 public async Task<string> UploadFile(...)
 {
-    ...
-    using (Stream stream = new FileStream(path, FileMode.Create))
-    {
-        file.CopyTo(stream);
-    }
-    return name;
+    await using var stream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None, 4096, useAsync: true);
+    await file.CopyToAsync(stream);
+    return name;
 }

Also, new Random() per call is unsafe in parallel code; use Guid.NewGuid() or RandomNumberGenerator.

🧹 Nitpick comments (13)
Refhub/Models/Enums/DirectoryTypes.cs (1)

3-9: Enum naming & documentation could be tightened

  1. Consider adding XML‐doc comments for the enum itself and each value – consumers (especially in Swagger) will then see meaningful descriptions.
  2. “Files” is very generic; if it is acting as a catch-all, document that intent explicitly.
  3. If the numeric value 0 matters only for None, explicitly set the rest as well for readability:
         None = 0,
-        Images,
-        Books,
-        Files,
+        Images = 1,
+        Books  = 2,
+        Files  = 3,
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (2)

1-4: Unused using directives

Refhub.Service.Implement and Refhub.Service.Interface are not referenced in this file – remove them to keep compile-time warnings quiet.


9-14: Leverage Options pattern with validation

Current binding registers plain IOptions<S3Configuration> without validation. Using AddOptions enables automatic checks:

-    builder.Services.Configure<S3Configuration>(
-        builder.Configuration.GetSection("AWS:S3"));
+    builder.Services.AddOptions<S3Configuration>()
+           .Bind(builder.Configuration.GetSection("AWS:S3"))
+           .ValidateDataAnnotations()
+           .Validate(config => !string.IsNullOrWhiteSpace(config.ServiceURL),
+                     "ServiceURL must not be empty in non-AWS environments.");

This will fail fast at startup for broken configs instead of at first use.

Refhub/Refhub.csproj (1)

16-16: Isolate AWS SDK to where it is needed

If S3 usage is confined to one project, mark the package reference with <PrivateAssets>all</PrivateAssets> to avoid leaking to consuming projects in a solution.

-    <PackageReference Include="AWSSDK.S3" Version="4.0.1.4" />
+    <PackageReference Include="AWSSDK.S3" Version="4.0.1.4">
+      <PrivateAssets>all</PrivateAssets>
+    </PackageReference>
Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1)

1-1: Remove now-unused Refhub.Tools.Static import.

The FolderNameStatic helper is no longer referenced after the refactor; keep using directives minimal to avoid unnecessary dependencies.

Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)

2-3: Namespace and folder structure look off.

Refhub.Service.Implement.S3_Sample.Service is overly nested and redundant (Service appears twice, plus the “S3_Sample” folder name leaks an implementation detail).
Prefer something like Refhub.Service.Implement.S3 or move the class beside the other implementers.

Refhub/Migrations/20250611081248_fix-mig.cs (1)

8-9: Rename migration class to PascalCase to silence CS8981.

fixmig violates C# naming conventions and produces compiler warnings. Rename to FixMig (and update the file name accordingly).

🧰 Tools
🪛 GitHub Actions: 🚀 Build Pipeline

[warning] 8-8: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.

Refhub/Tools/Static/FolderNameStatic.cs (1)

7-17: Default case returns magic string "Order"

DirectoryTypes.None (or any future enum value) silently maps to "Order", which is misleading and brittle.
Consider either throwing ArgumentOutOfRangeException or adding an explicit Order member to the enum for clarity.

Refhub/Service/Interface/IFileUploaderService.cs (1)

5-9: Parameter casing & naming consistency

string Type violates .NET naming conventions (type, category, etc.) while other parameters are camel-cased.
Minor, but keeping the API surface consistent aids discoverability.

Refhub/Controllers/BookController.cs (1)

3-4: Avoid referencing concrete S3 implementation in controller

using Refhub.Service.Implement.S3_Sample.Service; is unnecessary; the controller depends only on the interface.
Keeping controllers free of implementation namespaces prevents accidental tight coupling.

Refhub/Service/Implement/S3FileUploaderService.cs (2)

51-56: GetKey relies on brittle string replace

Direct string replacement will fail for mixed-case URLs, query strings, or when the service URL pattern changes.

Consider parsing via Uri:

-private string GetKey(string realUrl)
-{
-    return realUrl.Replace($"https://{_bucketName}.{_region}/","");
-}
+private static string GetKey(string realUrl)
+{
+    var uri = new Uri(realUrl);
+    return uri.AbsolutePath.TrimStart('/');
+}

57-75: Sanitise name before building the S3 key

Only spaces are replaced; special characters (/, \, ?, …) will break S3 keys or allow directory traversal inside the bucket.

Minimum hardening:

-var key = $"{directoryName}/{type}/{name.Replace(" ","-")}{Path.GetExtension(file.FileName)}";
+var safeName = Path.GetFileNameWithoutExtension(name)
+                  .Replace(" ", "-")
+                  .Replace("/", "-")
+                  .Replace("\\", "-");
+var key = $"{directoryName}/{type}/{safeName}{Path.GetExtension(file.FileName)}";
Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1)

15-17: Migration class name triggers compiler warning

partial class fixmig violates the C# naming rules (CS8981).
Rename the migration (e.g., FixMig) and regenerate to avoid future keyword conflicts and improve readability.

🧰 Tools
🪛 GitHub Actions: 🚀 Build Pipeline

[warning] 16-16: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ba432a1 and dc3a765.

📒 Files selected for processing (17)
  • Refhub/Controllers/BookController.cs (3 hunks)
  • Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1 hunks)
  • Refhub/Migrations/20250611081248_fix-mig.cs (1 hunks)
  • Refhub/Migrations/AppDbContextModelSnapshot.cs (3 hunks)
  • Refhub/Models/AppSetting/S3Configuration.cs (1 hunks)
  • Refhub/Models/Enums/DirectoryTypes.cs (1 hunks)
  • Refhub/Program.cs (1 hunks)
  • Refhub/Refhub.csproj (2 hunks)
  • Refhub/Service/Implement/BookService.cs (4 hunks)
  • Refhub/Service/Implement/LocalFileUploaderService.cs (2 hunks)
  • Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)
  • Refhub/Service/Interface/IFileUploaderService.cs (1 hunks)
  • Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (2 hunks)
  • Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1 hunks)
  • Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (1 hunks)
  • Refhub/Tools/Static/FolderNameStatic.cs (1 hunks)
  • Refhub/appsettings.Development.json (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
  • S3Configuration (3-10)
Refhub/Service/Implement/BookService.cs (1)
Refhub/Tools/Static/FolderNameStatic.cs (2)
  • FolderNameStatic (5-18)
  • GetDirectoryName (7-17)
Refhub/Service/Implement/S3FileUploaderService.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
  • S3Configuration (3-10)
🪛 GitHub Actions: 🚀 Build Pipeline
Refhub/Migrations/20250611081248_fix-mig.cs

[warning] 8-8: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.

Refhub/Service/Implement/LocalFileUploaderService.cs

[error] 5-5: CS0535: 'LocalFileUploaderService' does not implement interface member 'IFileUploaderService.DownloadFileAsync(string)'.

Refhub/Migrations/20250611081248_fix-mig.Designer.cs

[warning] 16-16: CS8981: The type name 'fixmig' only contains lower-cased ascii characters. Such names may become reserved for the language.

🔇 Additional comments (6)
Refhub/Refhub.csproj (1)

6-8: Confirm .NET 9 target & AWS SDK version compatibility

Targeting net9.0 (an STS release) with AWSSDK.S3 4.0.1.4; verify runtime compatibility before production deployment. Consider pinning to LTS (net8.0) unless the shorter support window is intentional.

Refhub/Program.cs (1)

24-28: Bind after configuration sources are added

builder.BindS3Model() runs before the extra AddJsonFile(...) calls. While the default files are already loaded, any values coming from the additional environment-specific file may be missed. Safer order:

-#region CustomExtentionMethod 
-builder.BindS3Model();
-builder.Services.AddCustomService();
+// Additional config sources
+builder.Configuration
+       .SetBasePath(Directory.GetCurrentDirectory())
+       .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
+       .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional: true)
+       .AddEnvironmentVariables();
+
+// Bind after all sources are in place
+builder.BindS3Model();
+
+builder.Services.AddCustomService();
Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)

18-19: Verify lifetime & configuration of S3FileUploaderService.

AddScoped is fine if the service is stateless. If it wraps an IAmazonS3 client, that client should be a singleton to avoid socket exhaustion; ensure the service internally resolves a singleton client from DI instead of constructing per request.

Also double-check that S3Configuration is bound before this registration (startup order matters).
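
A minimal sketch of the singleton-client pattern described above (the `S3Configuration`, `IFileUploaderService`, and `S3FileUploaderService` names come from this PR; the exact registration shape is an assumption, not the PR's code):

```csharp
// Sketch only — assumes S3Configuration has already been bound (builder.BindS3Model()).
// Register the IAmazonS3 client once as a singleton so its HTTP handler is pooled,
// while the uploader service stays scoped and resolves the shared client from DI.
services.AddSingleton<IAmazonS3>(sp =>
{
    var cfg = sp.GetRequiredService<IOptions<S3Configuration>>().Value;
    var clientConfig = new AmazonS3Config
    {
        ServiceURL = cfg.ServiceURL,
        ForcePathStyle = true // required by many S3-compatible providers
    };
    return new AmazonS3Client(cfg.AccessKey, cfg.SecretKey, clientConfig);
});
services.AddScoped<IFileUploaderService, S3FileUploaderService>();
```

With this shape, `S3FileUploaderService` takes `IAmazonS3` as a constructor dependency instead of constructing a client per request.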

Refhub/Migrations/AppDbContextModelSnapshot.cs (1)

20-21: Auto-generated file – no manual changes required.

Snapshot reflects the new unique index and updated column type. Looks correct.

Refhub/Service/Implement/BookService.cs (1)

157-159: Confirm uploader returns a globally unique key, not just a local filename

UploadFile is invoked with enum-based folder names, which is 👍.
However, both current S3 and local implementations appear to return only the generated file name (no prefix / folder).
If you later compose a presigned URL or need to address the object in S3, a mere filename will collide across folders/buckets.
Please verify the contract – it should probably return the full object key (or absolute URL) instead.
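
A hedged illustration of the contract this comment asks for — returning the full object key rather than the bare file name (the helper name and folder-prefix scheme are hypothetical, chosen to mirror this PR's enum-based folder names):

```csharp
// Sketch: compose and return the full S3 object key so callers can address
// the object later (presigned URLs, deletes) without guessing the prefix.
public static string BuildObjectKey(string folderName, string originalFileName)
{
    var uniqueName = $"{Guid.NewGuid()}{Path.GetExtension(originalFileName)}";
    // e.g. "BookPdf/3f2b1c9e-….pdf" instead of just "3f2b1c9e-….pdf"
    return $"{folderName}/{uniqueName}";
}
```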

Refhub/Service/Implement/S3FileUploaderService.cs (1)

31-42: Region handling is inconsistent

You hard-code RegionEndpoint.USEast1 while separately storing _region from configuration and later building URLs from it.
If the bucket actually lives outside us-east-1, requests succeed only because a custom ServiceURL is supplied; signing/URL generation may still break.

Either:

  1. Pick the correct enum from RegionEndpoint.GetBySystemName(_s3Options.Value.Region) (preferred), or
  2. Omit RegionEndpoint when ServiceURL is set.
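
A sketch of both options (which config fields the SDK honors together can vary by SDK version, so treat this as illustrative rather than definitive):

```csharp
// Option 1 (preferred): derive the region enum from configuration
// instead of hard-coding RegionEndpoint.USEast1.
var region = RegionEndpoint.GetBySystemName(_s3Options.Value.Region);
var config = new AmazonS3Config { RegionEndpoint = region };

// Option 2: for an S3-compatible endpoint, set ServiceURL and supply the
// signing region via AuthenticationRegion rather than RegionEndpoint.
var compatConfig = new AmazonS3Config
{
    ServiceURL = _s3Options.Value.ServiceURL,
    AuthenticationRegion = _s3Options.Value.Region, // e.g. "ir-thr-at1"
    ForcePathStyle = true
};
```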

Comment thread Refhub/Models/AppSetting/S3Configuration.cs Outdated
Comment thread Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs Outdated
Comment thread Refhub/appsettings.Development.json Outdated
Comment on lines +2 to +10
"AWS": {
"S3": {
"Region": "s3.ir-thr-at1.arvanstorage.ir",
"AccessKey": "",
"SecretKey": "",
"BucketName": "test12333555",
"ServiceURL": "https://s3.ir-thr-at1.arvanstorage.ir/"
}
},

🛠️ Refactor suggestion

Avoid committing secrets & clarify S3 settings.

AccessKey / SecretKey must never be committed with real values; rely on environment variables or dotnet user-secrets in local dev.
Region currently contains a full endpoint. Conventionally this should be the region code (ir-thr-at1) while the endpoint/ServiceURL carries the scheme & host. Mislabeling may break SDK region resolution.

"AWS": {
  "S3": {
    "Region": "ir-thr-at1",
    "ServiceURL": "https://s3.ir-thr-at1.arvanstorage.ir",
    "AccessKey": "${AWS_ACCESS_KEY_ID}",
    "SecretKey": "${AWS_SECRET_ACCESS_KEY}"
  }
}
🤖 Prompt for AI Agents
In Refhub/appsettings.Development.json lines 2 to 10, do not commit real AWS
AccessKey and SecretKey values; instead, replace them with placeholders or
environment variable references to avoid exposing secrets. Also, update the
Region field to only contain the region code (e.g., "ir-thr-at1") rather than
the full endpoint URL, and ensure the ServiceURL field contains the full scheme
and host URL. This aligns with AWS SDK conventions and prevents region
resolution issues.
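
For local development, the keys can live in user secrets instead of the JSON file. The key names below follow this PR's `AWS:S3` section; the values are placeholders:

```shell
# Run from the Refhub project directory; requires a <UserSecretsId> in the .csproj.
dotnet user-secrets init
dotnet user-secrets set "AWS:S3:AccessKey" "<your-access-key>"
dotnet user-secrets set "AWS:S3:SecretKey" "<your-secret-key>"
```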

Comment on lines +13 to +26
migrationBuilder.AlterColumn<string>(
    name: "Slug",
    table: "Books",
    type: "nvarchar(450)",
    nullable: false,
    oldClrType: typeof(string),
    oldType: "nvarchar(max)");

migrationBuilder.CreateIndex(
    name: "IX_Books_Slug",
    table: "Books",
    column: "Slug",
    unique: true);
}

💡 Verification agent

❓ Verification inconclusive

Unique index may fail on existing duplicate slugs.

The migration assumes all current Books.Slug values are unique and ≤ 450 chars. If either condition is violated the deploy will fail.

Recommend adding a data-cleanup step (or a pre-deployment check) before applying the migration:

// Note: migrationBuilder.Sql(...) returns void, so a C#-side check like
// .Any() over its result will not compile; the duplicate-slug validation
// must be expressed in SQL itself (e.g., a THROW-based guard, as in the patch below).

Add pre-migration checks for duplicate or over‐length slugs

Creating a unique index on Books.Slug and shrinking it to nvarchar(450) will fail if there are existing duplicate values or slugs longer than 450 characters. Please add validation steps in the Up() method before altering the column:

• File: Refhub/Migrations/20250611081248_fix-mig.cs
Location: inside protected override void Up(MigrationBuilder migrationBuilder), before the AlterColumn call.

Suggested patch:

 public partial class fix_mig : Migration
 {
     protected override void Up(MigrationBuilder migrationBuilder)
     {
+        // Prevent migration if there are duplicate slugs
+        migrationBuilder.Sql(@"
+            IF EXISTS (
+                SELECT Slug
+                FROM Books
+                GROUP BY Slug
+                HAVING COUNT(*) > 1
+            )
+            THROW 51000, 'Duplicate slugs detected; manual cleanup required.', 1;
+        ");
+
+        // Prevent migration if any slug exceeds 450 characters
+        migrationBuilder.Sql(@"
+            IF EXISTS (
+                SELECT 1
+                FROM Books
+                WHERE LEN(Slug) > 450
+            )
+            THROW 51000, 'Slugs longer than 450 characters detected; manual cleanup required.', 1;
+        ");
         migrationBuilder.AlterColumn<string>(
             name: "Slug",
             table: "Books",
             type: "nvarchar(450)",
             nullable: false,
             oldClrType: typeof(string),
             oldType: "nvarchar(max)");

         migrationBuilder.CreateIndex(
             name: "IX_Books_Slug",
             table: "Books",
             column: "Slug",
             unique: true);
     }
 }

Alternatively, run a one-off cleanup script or pre-deployment job to remove duplicates and truncate or reject over-long slugs.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
migrationBuilder.AlterColumn<string>(
    name: "Slug",
    table: "Books",
    type: "nvarchar(450)",
    nullable: false,
    oldClrType: typeof(string),
    oldType: "nvarchar(max)");

migrationBuilder.CreateIndex(
    name: "IX_Books_Slug",
    table: "Books",
    column: "Slug",
    unique: true);
}
public partial class fix_mig : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        // Prevent migration if there are duplicate slugs
        migrationBuilder.Sql(@"
            IF EXISTS (
                SELECT Slug
                FROM Books
                GROUP BY Slug
                HAVING COUNT(*) > 1
            )
            THROW 51000, 'Duplicate slugs detected; manual cleanup required.', 1;
        ");

        // Prevent migration if any slug exceeds 450 characters
        migrationBuilder.Sql(@"
            IF EXISTS (
                SELECT 1
                FROM Books
                WHERE LEN(Slug) > 450
            )
            THROW 51000, 'Slugs longer than 450 characters detected; manual cleanup required.', 1;
        ");

        migrationBuilder.AlterColumn<string>(
            name: "Slug",
            table: "Books",
            type: "nvarchar(450)",
            nullable: false,
            oldClrType: typeof(string),
            oldType: "nvarchar(max)");

        migrationBuilder.CreateIndex(
            name: "IX_Books_Slug",
            table: "Books",
            column: "Slug",
            unique: true);
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        // ... existing rollback logic ...
    }
}
🤖 Prompt for AI Agents
In Refhub/Migrations/20250611081248_fix-mig.cs around lines 13 to 26, before
altering the Slug column and creating the unique index, add a pre-migration
validation step in the Up() method to check for duplicate Slug values and any
Slugs longer than 450 characters. Use migrationBuilder.Sql to query for
duplicates and length violations, and if any are found, throw an
InvalidOperationException to prevent the migration from proceeding until manual
cleanup is done. This ensures the unique index creation and column alteration
will not fail due to existing data issues.

Comment thread Refhub/Service/Implement/LocalFileUploaderService.cs Outdated
Comment thread Refhub/Controllers/BookController.cs Outdated
Comment on lines +1 to +4
using Microsoft.Extensions.Options;
using Refhub.Models.AppSetting;
using Refhub.Service.Interface;


⚠️ Potential issue

Missing core using directives – code will not compile

IFormFile, Path, and MemoryStream are referenced but Microsoft.AspNetCore.Http and System.IO are not imported.
Compilation will fail with CS0246 / CS0103.

-using Microsoft.Extensions.Options;
+using System.IO;
+using Microsoft.AspNetCore.Http;
+using Microsoft.Extensions.Options;
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
using Microsoft.Extensions.Options;
using Refhub.Models.AppSetting;
using Refhub.Service.Interface;
using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Options;
using Refhub.Models.AppSetting;
using Refhub.Service.Interface;
🤖 Prompt for AI Agents
In Refhub/Service/Implement/S3FileUploaderService.cs at the top of the file
(lines 1 to 4), add the missing using directives for Microsoft.AspNetCore.Http
to resolve IFormFile, and System.IO to resolve Path and MemoryStream. This will
fix the compilation errors CS0246 and CS0103 caused by undefined types.

Comment thread Refhub/Service/Implement/S3FileUploaderService.cs
Implement cascade delete behavior for BookId and AuthorId
in BookAuthorConfiguration to ensure related entries are
automatically removed when a Book or Author is deleted.
Updated the `BookService` class to support file uploads for both book files and images using `IFileUploaderService`. Added validation for upload paths.

Improved the book deletion method to load and remove related `BookAuthors` and `BookRelations`, ensuring all associated entities are handled correctly during deletion.

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🔭 Outside diff range comments (1)
Refhub/appsettings.Development.json (1)

1-10: Add missing AWS:S3 configuration

The code registers S3Configuration from the AWS:S3 section, but no such section is defined in any JSON/YAML settings. You must either add it to your environment-specific config file(s) or supply the values via environment variables.

• Location requiring attention:

  • Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (calls builder.Configuration.GetSection("AWS:S3"))

• Recommended JSON snippet (e.g. in appsettings.Development.json):

{
  "Logging": { },
  "AWS": {
    "S3": {
      "Region":     "us-east-1",
      "AccessKey":  "<YourAccessKey>",
      "SecretKey":  "<YourSecretKey>",
      "BucketName": "<YourBucketName>"
    }
  }
}

Alternatively, set these via environment variables using the double-underscore convention:

  • AWS__S3__Region
  • AWS__S3__AccessKey
  • AWS__S3__SecretKey
  • AWS__S3__BucketName
♻️ Duplicate comments (15)
Refhub/Tools/Static/BucketNameStatic.cs (2)

7-17: LGTM! Clean enum-to-string mapping implementation.

The static class provides a clean, type-safe way to convert BucketNames enum values to their string representations. The switch expression with nameof() is appropriate and the exception handling is correct.

The past review comment about the exception message has been addressed - the current code correctly references "bucket name" rather than "directory type".


15-15: Fix the exception message to reference bucket names instead of directory type.

The exception message incorrectly refers to "directory type" when this method handles bucket names.

Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (1)

50-53: LGTM! Proper conditional rendering implemented.

The conditional check @if (!string.IsNullOrEmpty(Model.ImagePath)) correctly prevents broken image links. This properly addresses the removal of the ConvertForBookPathImage() extension method.

The past review comment about URL validation at the model level and database migration verification remains valid but is outside the scope of this view file.

Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)

13-22: LGTM! Cascade delete behavior properly implemented.

The addition of OnDelete(DeleteBehavior.Cascade) to both foreign key relationships ensures proper cleanup of BookAuthor entries when either a Book or Author is deleted. This maintains referential integrity and reduces the need for manual cleanup in service layers.

This addresses the previous review suggestions about implementing cascade delete behavior for consistency across the application.

Refhub/Components/Views/Home/LastBookView.cshtml (1)

28-31: Image validation is properly implemented.

The conditional rendering with null/empty check for ImagePath correctly prevents broken image tags, which addresses the S3 migration requirements.

Refhub/Migrations/20250611081248_fix-mig.cs (1)

13-26: Add pre-migration validation for data integrity.

Creating a unique index on Slug and reducing the column size to nvarchar(450) will fail if there are existing duplicate values or slugs longer than 450 characters in the database.

Refhub/Models/Books/UpdateBookVM.cs (1)

31-32: LGTM! URL validation properly implemented.

The [Url] attribute correctly validates that ImagePath contains a valid URL, which aligns with the S3-based storage architecture. The localized error message using the Messages resource class is appropriate.

Refhub/Service/Implement/BookService.cs (3)

153-154: Slug existence check properly maintained.

The slug existence check using AnyAsync is correctly implemented to prevent duplicate slugs and uses async operations as recommended in previous reviews.


259-260: Query optimization implemented.

The simplified query without unnecessary .Include(b => b.BookAuthors) improves performance as recommended in previous reviews, since cascade delete handles the relationship automatically.


276-280: Critical issue resolved: BookRelations handling added.

The implementation now properly handles BookRelations before deleting a book, addressing the critical issue from previous reviews where DeleteBehavior.Restrict would cause runtime errors.

Refhub/Service/Interface/IFileUploaderService.cs (1)

7-7: Interface correctly defines non-nullable bucketName parameter.

The bucketName parameter is properly defined as non-nullable string, which creates a stronger contract as suggested in previous reviews.

Refhub/Controllers/BookController.cs (1)

13-13: Fix parameter naming to follow C# conventions.

Primary constructor parameters should use camelCase naming without underscores.

-public class BookController(IBookService bookService, IFileUploaderService s3FileUploaderService, ILogger<BookController> logger, IMessageService messageService) : Controller
+public class BookController(IBookService bookService, IFileUploaderService fileUploaderService, ILogger<BookController> logger, IMessageService messageService) : Controller

Note: You'll also need to update all references to these parameters in the controller methods to use the new names without underscores.

Refhub/Resources/Messages.Designer.cs (1)

1-25: Don't manually edit auto-generated files - configure the generator instead.

This file is auto-generated and manual changes will be lost upon regeneration. To make the resources public, configure the resource generator properly:

  1. In the .resx file properties, change Custom Tool from ResXFileCodeGenerator to PublicResXFileCodeGenerator
  2. Or add to your .csproj:
<PropertyGroup>
  <PublicResXFileCodeGeneratorTargets>true</PublicResXFileCodeGeneratorTargets>
</PropertyGroup>
Refhub/Service/Implement/S3FileUploaderService.cs (2)

121-121: Use English for code comments.

For consistency and international collaboration, use English for all code comments.

-// مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
+// For example, if the file doesn't exist or the key is incorrect

1-4: Add missing using directives for compilation.

The code references IFormFile and Path but is missing required imports.

 using Microsoft.Extensions.Options;
+using System.IO;
+using Microsoft.AspNetCore.Http;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;
🧹 Nitpick comments (9)
Refhub/Program.cs (1)

24-30: Move BindS3Model() after configuration sources to avoid stale binding

builder.BindS3Model() executes before the extra AddJsonFile(...) providers are registered.
If those JSON files (or env-vars) override the "AWS:S3" section, the bound S3Configuration will not pick them up.

-#region CustomExtentionMethod 
-builder.BindS3Model();
-builder.Services.AddCustomService();
-
-builder.Configuration
-            .SetBasePath(Directory.GetCurrentDirectory())
-            .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
-            .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional: true)
-            .AddEnvironmentVariables();
+#region CustomExtentionMethod 
+builder.Configuration
+        .SetBasePath(Directory.GetCurrentDirectory())
+        .AddJsonFile("appsettings.json",             optional:false, reloadOnChange:true)
+        .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json", optional:true)
+        .AddEnvironmentVariables();
+
+builder.BindS3Model();
+builder.Services.AddCustomService();

Re-ordering guarantees the latest values are bound.

Refhub/Views/Book/BookDetails.cshtml (1)

24-26: Url.Content not needed for absolute S3 URLs

Url.Content() is only useful for virtual paths.
If Model.ImagePath already contains a full URL, output it directly:

-<img src="@Url.Content(Model.ImagePath)" … />
+<img src="@Model.ImagePath" … />
Refhub/Areas/Admin/Views/ManageBook/Index.cshtml (1)

1-3: Remove unused Refhub.Tools.ExtensionMethod import

The view no longer uses any extension methods from that namespace; the @using directive can be dropped.

Refhub/Models/Enums/BucketNames.cs (1)

7-7: Fix formatting inconsistency.

Remove the extra whitespace before BookPdf to maintain consistent indentation.

-       BookPdf,
+        BookPdf,
Refhub/Data/Configuration/BookConfiguration.cs (2)

30-30: Remove duplicate semicolon.

There's a double semicolon at the end of the line that should be removed.

-            .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Cascade); ;
+            .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Cascade);

36-36: Remove duplicate semicolon.

There's a double semicolon at the end of the line that should be removed.

-            .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Cascade); ;
+            .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Cascade);
Refhub/Controllers/BookController.cs (1)

65-65: Add missing space after comma.

-return StatusCode(500,messageService.Get("DownloadError"));
+return StatusCode(500, messageService.Get("DownloadError"));
Refhub/Service/Implement/S3FileUploaderService.cs (2)

37-37: Remove commented out code.

If _bucketName field is not needed, remove this commented line entirely.

-//_bucketName = s3Options.Value.BucketName;

32-32: Replace Persian comments with English.

For consistency and better collaboration, use English for all code comments.

-ForcePathStyle = true // بسیار مهم برای کار با آروان‌کلاد
+ForcePathStyle = true // Very important for working with ArvanCloud

-// برای آروان کلاد:
+// For ArvanCloud:

-// برای آروان کلاد:
+// For ArvanCloud:

Also applies to: 41-41, 48-48

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c64a2d9 and 6544595.

📒 Files selected for processing (33)
  • Refhub/Areas/Admin/Views/ManageBook/Index.cshtml (1 hunks)
  • Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (1 hunks)
  • Refhub/Components/Views/Home/BestBookView.cshtml (1 hunks)
  • Refhub/Components/Views/Home/LastBookView.cshtml (1 hunks)
  • Refhub/Controllers/BookController.cs (3 hunks)
  • Refhub/Data/Configuration/BookAuthorConfiguration.cs (1 hunks)
  • Refhub/Data/Configuration/BookConfiguration.cs (1 hunks)
  • Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1 hunks)
  • Refhub/Migrations/20250611081248_fix-mig.cs (1 hunks)
  • Refhub/Migrations/20250612083534_AddUniqueIndexToBookSlugAndModifyType.cs (1 hunks)
  • Refhub/Models/AppSetting/S3Configuration.cs (1 hunks)
  • Refhub/Models/Books/UpdateBookVM.cs (2 hunks)
  • Refhub/Models/Enums/BucketNames.cs (1 hunks)
  • Refhub/Models/Enums/DirectoryTypes.cs (1 hunks)
  • Refhub/Program.cs (2 hunks)
  • Refhub/Refhub.csproj (1 hunks)
  • Refhub/Resources/Messages.Designer.cs (34 hunks)
  • Refhub/Resources/Messages.fa.resx (2 hunks)
  • Refhub/Resources/Messages.resx (1 hunks)
  • Refhub/Service/Implement/BookService.cs (5 hunks)
  • Refhub/Service/Implement/LocalFileUploaderService.cs (0 hunks)
  • Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)
  • Refhub/Service/Interface/IFileUploaderService.cs (1 hunks)
  • Refhub/Tools/Exceptions/EntityNotFoundException.cs (0 hunks)
  • Refhub/Tools/Exceptions/FileDownloadException.cs (1 hunks)
  • Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (2 hunks)
  • Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1 hunks)
  • Refhub/Tools/ExtentionMethod/PathExtionMethod.cs (0 hunks)
  • Refhub/Tools/Static/BucketNameStatic.cs (1 hunks)
  • Refhub/Tools/Static/FolderNameStatic.cs (1 hunks)
  • Refhub/Views/Book/BookDetails.cshtml (2 hunks)
  • Refhub/Views/Book/List.cshtml (1 hunks)
  • Refhub/appsettings.Development.json (1 hunks)
💤 Files with no reviewable changes (3)
  • Refhub/Tools/ExtentionMethod/PathExtionMethod.cs
  • Refhub/Tools/Exceptions/EntityNotFoundException.cs
  • Refhub/Service/Implement/LocalFileUploaderService.cs
🧰 Additional context used
🧬 Code Graph Analysis (3)
Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)
Refhub/Data/Models/Author.cs (1)
  • Author (3-12)
Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)
Refhub/Models/AppSetting/S3Configuration.cs (1)
  • S3Configuration (8-15)
Refhub/Migrations/20250611081248_fix-mig.Designer.cs (2)
Refhub/Migrations/AppDbContextModelSnapshot.cs (1)
  • DbContext (13-541)
Refhub/Migrations/20250611081248_fix-mig.cs (1)
  • fixmig (8-43)
🔇 Additional comments (32)
Refhub/Components/Views/Home/BestBookView.cshtml (2)

24-27: LGTM! Proper null validation for image rendering.

The conditional check @if (!string.IsNullOrEmpty(item.ImagePath)) correctly prevents broken image links when ImagePath is null or empty. This addresses the concern from the previous review about removing the ConvertForBookPathImage() extension method.


24-27: LGTM! Proper null check prevents broken images.

The conditional rendering correctly prevents broken images when ImagePath is null or empty, which addresses the removal of the extension method.

Refhub/Views/Book/List.cshtml (1)

103-107: Null-check guard looks good

The conditional rendering prevents broken images after the S3 migration.
No further issues spotted.

Refhub/Tools/ExtentionMethod/AddServiceExtentionMethod.cs (1)

17-20: Verify service prerequisites for S3FileUploaderService

Registration switch is correct.
Please confirm that:

  1. BindS3Model() is executed (see Program.cs), and
  2. All constructor dependencies of S3FileUploaderService (e.g., IOptions<S3Configuration>, IAmazonS3) are registered.

Otherwise the app will fail at startup with InvalidOperationException.

Refhub/Areas/Admin/Views/ManageBook/Index.cshtml (1)

51-56: Image null-check added — good catch

Prevents broken thumbnails in the admin grid.

Refhub/Models/Enums/DirectoryTypes.cs (1)

3-10: Well-designed enum implementation.

The enum follows best practices with explicit zero value, descriptive names, and consistent formatting. Good foundation for strongly-typed directory management.

Refhub/Data/Configuration/BookConfiguration.cs (1)

29-30: Cascade delete behavior is appropriate.

The cascade delete configuration for RelatedTo and RelatedFrom relationships ensures referential integrity when books are deleted, which is the correct behavior for these dependent relationships.

Also applies to: 35-36

Refhub/Tools/Exceptions/FileDownloadException.cs (1)

1-13: LGTM! Well-structured custom exception class.

The implementation follows standard .NET exception patterns with proper inheritance and constructor overloads for both message-only and message+inner exception scenarios. This will provide clear error handling for S3 file download operations.

Refhub/Tools/ExtentionMethod/BindAppSettingToModelExtentionMethod.cs (1)

9-14: LGTM! Proper configuration binding implementation.

The extension method correctly implements the configuration binding pattern using Configure<T> and maintains the fluent interface by returning the builder. The "AWS:S3" configuration section path is appropriate for S3 settings.

Refhub/Migrations/20250612083534_AddUniqueIndexToBookSlugAndModifyType.cs (1)

21-25: LGTM! Safe index dropping implementation.

The raw SQL implementation correctly addresses the previous concern about migration failures by using a conditional check against sys.indexes before dropping the existing index. This ensures the migration won't fail if the index doesn't exist.

Refhub/Refhub.csproj (4)

7-7: LGTM! UserSecretsId enables secure configuration management.

The addition of UserSecretsId is appropriate for managing sensitive AWS S3 credentials securely during development.


16-16: LGTM! AWS SDK package version is up-to-date.

The AWSSDK.S3 package version 4.0.4.2 is the latest version and addresses performance improvements mentioned in previous reviews.


28-30: LGTM! Proper configuration for public resource generation.

The PublicResXFileCodeGeneratorTargets property correctly enables public access to generated resource classes, which is necessary for validation attributes in the model classes.


33-33: LGTM! Resource generator change enables public access.

Changing from ResXFileCodeGenerator to PublicResXFileCodeGenerator is appropriate to make the Messages class public, enabling its use in validation attributes across the application.

Refhub/Models/AppSetting/S3Configuration.cs (1)

1-16: Excellent implementation following previous feedback.

The S3Configuration class properly implements all the suggestions from previous reviews:

  • Uses sealed modifier for immutability
  • Implements init accessors instead of set
  • Includes [Required] attributes for validation
  • Provides clear documentation about sensitive data handling
  • Makes ServiceURL nullable as appropriate
Refhub/Tools/Static/FolderNameStatic.cs (1)

7-17: Good refactoring with proper error handling.

The refactored implementation properly addresses the previous feedback by:

  • Using enum-based approach for type safety
  • Throwing ArgumentOutOfRangeException for unhandled enum values instead of a default case
  • Using nameof for consistent string mapping
  • Providing clear error messages

This eliminates the risk of unexpected behavior from unhandled enum values.

Refhub/Resources/Messages.resx (1)

225-236: Well-structured resource entries for file handling.

The new resource entries are appropriate for the S3 file handling functionality:

  • FileNotFound, InvalidFileName, DownloadError support error handling
  • Book_ImagePathMustBeUrl enforces URL validation requirements
  • Messages are clear and user-friendly
Refhub/Resources/Messages.fa.resx (2)

210-212: Translation corrected properly.

The Persian translation for Keyword_AlreadyExists has been corrected to "کلید واژه از قبل وجود دارد" which properly conveys "already exists" rather than just "found."


225-236: Appropriate Persian translations for file handling.

The Persian translations for the new file handling messages are accurate and user-friendly:

  • FileNotFound: "فایل مورد نظر پیدا نشد."
  • InvalidFileName: "نام فایل وارد شده نامعتبر است"
  • DownloadError: "خطا در دانلود فایل رخ داد"
  • Book_ImagePathMustBeUrl: "مسیر تصویر کتاب باید یک URL معتبر باشد"
Refhub/Service/Implement/BookService.cs (4)

8-8: Good addition of enum namespace.

Adding the Refhub.Models.Enums namespace supports the new bucket naming system with type safety.


160-161: Excellent migration to bucket-based file storage.

The implementation properly uses the new enum-based bucket naming system:

  • BucketNameStatic.GetName(BucketNames.BookPdf) for PDF files
  • BucketNameStatic.GetName(BucketNames.BookImages) for images
  • Method name corrected from UpdloadFile to UploadFile

This provides type safety and consistency across the application.


215-229: Consistent bucket naming in update operations.

The file update logic properly uses the new bucket naming system for both deletion and upload operations, ensuring consistency with the S3 migration.


268-274: Proper file deletion with bucket names.

File deletion operations correctly use the new bucket naming system, ensuring consistency with the S3 storage migration.

Refhub/Service/Interface/IFileUploaderService.cs (1)

5-5: Good parameter naming improvement!

The use of bucketName instead of directoryName makes the interface clearer and more aligned with S3 terminology.

Refhub/Migrations/20250611081248_fix-mig.Designer.cs (1)

1-542: Auto-generated migration file looks correct.

This EF Core migration designer file is auto-generated and properly defines the database schema including Identity entities and domain models. No manual modifications needed.

Refhub/Controllers/BookController.cs (1)

33-67: Well-implemented download endpoint with proper error handling.

The implementation addresses previous review comments by:

  • Using query parameter for fileUrl instead of route parameter
  • Proper URL validation
  • Dynamic content type determination
  • Specific exception handling with localized messages
  • Proper logging of errors
Refhub/Service/Implement/S3FileUploaderService.cs (3)

46-60: Good implementation of URL to key extraction.

The method properly validates that the URL starts with the expected prefix before extracting the key, which is more robust than using simple string replacement.


62-80: Well-implemented upload method with unique file naming.

The implementation correctly:

  • Generates unique file names using GUID to prevent overwrites
  • Uses proper stream disposal with using statement
  • Sets appropriate public read permissions

98-124: Correct implementation avoiding stream disposal issues.

The method properly copies the S3 response stream to a new MemoryStream before returning, which prevents the disposal issue mentioned in previous reviews. Good error handling with specific exception type.

Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (1)

50-53: LGTM! Conditional rendering prevents broken image previews.

The null check correctly prevents broken images in the admin interface when updating books without image paths.

Refhub/appsettings.Development.json (1)

1-9: LGTM! Clean configuration focused on logging.

The simplified configuration appropriately removes connection strings and focuses on logging settings for development.

Refhub/Data/Configuration/BookAuthorConfiguration.cs (1)

13-21: LGTM! Proper cascade delete behavior implemented.

The cascade delete configuration correctly ensures that BookAuthor entries are automatically cleaned up when their parent Book or Author entities are deleted, maintaining referential integrity.
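In EF Core terms, the behavior described above is typically configured along these lines (a sketch with assumed navigation property names, not the PR's verbatim configuration file):

```csharp
public class BookAuthorConfiguration : IEntityTypeConfiguration<BookAuthor>
{
    public void Configure(EntityTypeBuilder<BookAuthor> builder)
    {
        // Deleting a Book removes its BookAuthor join rows automatically.
        builder.HasOne(ba => ba.Book)
            .WithMany(b => b.BookAuthors)
            .HasForeignKey(ba => ba.BookId)
            .OnDelete(DeleteBehavior.Cascade);

        // Deleting an Author likewise cleans up its join rows.
        builder.HasOne(ba => ba.Author)
            .WithMany(a => a.BookAuthors)
            .HasForeignKey(ba => ba.AuthorId)
            .OnDelete(DeleteBehavior.Cascade);
    }
}
```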

@abolfazlshs80

Contributor

/gemini review


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request introduces a major change by migrating file storage to S3. The implementation is well-structured, but there's a critical security issue regarding public file access and areas for improvement concerning code correctness and maintainability.

- correct parameter name from 'filePath' to 'fileUrl' in DownloadFile action link
- ensure proper file download functionality for related books
- removed duplicate HttpGet attribute for the download endpoint
- set uploaded file ACL to private to prevent unauthorized access
- prevent cascade delete when deleting a book to avoid unintended data loss
- restrict delete behavior for both RelatedTo and RelatedFrom relationships

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 0

♻️ Duplicate comments (4)
Refhub/Service/Implement/S3FileUploaderService.cs (4)

1-16: Missing essential using directives for compilation.

The code references IFormFile, Path, and MemoryStream but lacks the required using statements for Microsoft.AspNetCore.Http and System.IO.

+using System.IO;
+using Microsoft.AspNetCore.Http;
 using Microsoft.Extensions.Options;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;

84-85: Redundant null check for non-nullable parameter.

Since the method signature uses a non-nullable string bucketName, this null check is unnecessary and can be removed. The compiler ensures a non-null value is passed.

-            if (string.IsNullOrWhiteSpace(bucketName))
-                throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));

100-101: Redundant null check for non-nullable parameter.

Similar to the DeleteFile method, this null check is unnecessary since bucketName is a non-nullable string parameter.

-            if (string.IsNullOrWhiteSpace(bucketName))
-                throw new ArgumentException("Bucket name cannot be null or empty.", nameof(bucketName));

110-118: Consider memory efficiency for large files.

The current implementation loads the entire file into memory, which could cause issues with large files. For better scalability, consider returning the S3 response stream directly or implementing streaming for large files.

However, the current approach may be acceptable depending on your use case and expected file sizes. The MemoryStream disposal responsibility is correctly handled by the caller.
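One possible streaming-oriented alternative (an illustrative sketch, not the PR's implementation; it assumes the same `_s3Client` field and `GetKey` helper exist) would hand the S3 response stream to the caller instead of buffering:

```csharp
// Hypothetical variant: avoids buffering the whole object in a MemoryStream.
// The caller must dispose the returned stream; disposing it releases the
// underlying HTTP response held by the SDK.
public async Task<Stream> DownloadFileStreamingAsync(string fileUrl, CancellationToken ct, string bucketName)
{
    var request = new GetObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = GetKey(fileUrl, bucketName)
    };

    // No 'using' here on purpose: the stream must outlive this method.
    var response = await _s3Client.GetObjectAsync(request, ct);
    return response.ResponseStream;
}
```

With this shape, an MVC action can pass the stream straight to `File(stream, contentType)`, which disposes it after the response is written.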

🧹 Nitpick comments (1)
Refhub/Service/Implement/S3FileUploaderService.cs (1)

37-37: Remove commented-out dead code.

The commented assignment _bucketName = s3Options.Value.BucketName; should be removed since the _bucketName field is no longer used in the current implementation.

-            //_bucketName = s3Options.Value.BucketName;
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6544595 and 824547c.

📒 Files selected for processing (4)
  • Refhub/Controllers/BookController.cs (3 hunks)
  • Refhub/Data/Configuration/BookConfiguration.cs (1 hunks)
  • Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)
  • Refhub/Views/Book/BookDetails.cshtml (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
  • Refhub/Views/Book/BookDetails.cshtml
  • Refhub/Data/Configuration/BookConfiguration.cs
  • Refhub/Controllers/BookController.cs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: 🐳 Docker Build & Test
🔇 Additional comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)

39-60: LGTM! Robust URL handling implementation.

The helper methods correctly handle URL generation and key extraction with improved robustness:

  • Uses path-style URL format consistent with ForcePathStyle = true
  • GetKey properly validates URL prefix before extraction
  • Uses StartsWith + Substring instead of fragile Replace operations

62-80: LGTM! Secure and robust file upload implementation.

The upload method correctly addresses previous security and robustness concerns:

  • Generates unique keys with GUID to prevent overwrites
  • Uses S3CannedACL.Private for secure access control
  • Properly disposes of file stream with using statement

- remove commented-out line assigning bucket name
- this line was unnecessary and did not affect functionality
- remove bucket name validation from DeleteFile and DownloadFileAsync methods
- simplify the logic
@abolfazlshs80
Contributor

/gemini review


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request introduces S3 integration for file storage, enhancing scalability and reliability. Key improvements include the S3FileUploaderService, S3 configuration management, and improved data integrity with Entity Framework configurations. Code cleanup and bug fixes are also notable. Suggestions are provided to improve code clarity, consistency, and remove potential dead code.

Comment on lines +1 to +17
using Refhub.Models.Enums;

namespace Refhub.Tools.Static;

public static class FolderNameStatic
{
    public static string GetDirectoryImages = "Images";
    public static string GetDirectoryBooks = "Books";
    public static string GetDirectoryFiles = "Files";

    public static string GetDirectoryName(DirectoryTypes folder)
    {
        return folder switch
        {
            DirectoryTypes.Images => nameof(DirectoryTypes.Images),
            DirectoryTypes.Books => nameof(DirectoryTypes.Books),
            DirectoryTypes.Files => nameof(DirectoryTypes.Files),

            _ => throw new ArgumentOutOfRangeException(nameof(folder), $"Unsupported directory type: {folder}")
        };
    }


medium

This static class FolderNameStatic and the associated DirectoryTypes enum appear to be remnants of the previous local file storage implementation. Since the application has been migrated to use S3 and the LocalFileUploaderService has been removed, this code is likely unused.

If it's confirmed to be dead code, it should be removed to keep the codebase clean and avoid confusion.

Comment on lines +36 to +39
if (string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _))
{
    return NotFound(messageService.Get("InvalidFileName"));
}


medium

Consider extracting the validation logic for fileUrl into a separate method or validator class. This would improve the readability and maintainability of the DownloadFile action by reducing its complexity and adhering to the Single Responsibility Principle.

Comment on lines 102 to 104
await context.Authors.AddAsync(author, ct);
await context.SaveChangesAsync(ct);
return true;


medium

It's crucial to handle potential exceptions that might occur during the SaveChangesAsync operation. Wrapping this call in a try-catch block would allow you to log any errors and potentially retry the operation or take other corrective actions.

try
{
    await context.SaveChangesAsync(ct);
}
catch (Exception ex)
{
    // Log the exception or handle it appropriately
    Console.WriteLine($"Error saving changes: {ex.Message}");
    throw;
}

Comment on lines +31 to 33
[Url(ErrorMessageResourceType = typeof(Messages),
ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }


medium

Adding the [Url] attribute to the ImagePath property is a good step to ensure that the image path is a valid URL. However, consider adding client-side validation to provide immediate feedback to the user and prevent unnecessary server-side requests.

- refine conditional statement for image display
- ensure consistent code style for better readability
- remove directoryName parameter and use bucketName directly
- improve code readability and maintainability
- remove FolderNameStatic class
- the class is no longer needed
- add IsValidFileUrl method to check file URL validity
- return NotFound if file URL is invalid
- enable client-side validation in admin area
- include jquery, jquery-validation, and jquery-validation-unobtrusive libraries
- add a try-catch block around SaveChangesAsync to catch and log exceptions
- rethrow the exception to ensure it's handled upstream
- implement client-side validation for manage book form

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)

1-4: Missing required using directives for compilation.

The code references IFormFile, Path, and MemoryStream without importing their namespaces, which will cause compilation errors CS0246/CS0103.


80-93: Add input validation and error handling to delete method.

The method lacks input validation and error handling, which could lead to runtime exceptions or silent failures.

 public async Task DeleteFile(string realUrl, string bucketName)
 {
+    if (string.IsNullOrWhiteSpace(realUrl))
+        throw new ArgumentException("URL cannot be null or empty", nameof(realUrl));
+    
+    if (string.IsNullOrWhiteSpace(bucketName))
+        throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));

     var key = GetKey(realUrl, bucketName);

     var request = new DeleteObjectRequest
     {
         BucketName = bucketName.ToLower(),
         Key = key
     };

-    await _s3Client.DeleteObjectAsync(request);
+    try
+    {
+        await _s3Client.DeleteObjectAsync(request);
+    }
+    catch (AmazonS3Exception ex) when (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
+    {
+        // File already doesn't exist, consider this a successful deletion
+    }
 }
🧹 Nitpick comments (2)
Refhub/Service/Implement/S3FileUploaderService.cs (2)

8-16: Consider moving using directives to file level for better organization.

While the current nested using directives work, it's more conventional to place all using statements at the file level for better readability and maintainability.

+using System.IO;
+using Microsoft.AspNetCore.Http;
+using Amazon;
+using Amazon.Runtime;
+using Amazon.S3;
+using Amazon.S3.Model;
+using Refhub.Models.Enums;
+using Refhub.Resources;
+using Refhub.Tools.Exceptions;
+using System.Threading;
 using Microsoft.Extensions.Options;
 using Refhub.Models.AppSetting;
 using Refhub.Service.Interface;

 namespace Refhub.Service.Implement
 {
-    using Amazon;
-    using Amazon.Runtime;
-    using Amazon.S3;
-    using Amazon.S3.Model;
-    using Refhub.Models.Enums;
-    using Refhub.Resources;
-    using Refhub.Tools.Exceptions;
-    using System.Threading;

116-116: Remove Persian comment for consistency.

Replace the Persian comment with English for better code maintainability and international collaboration.

-        // مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
+        // Handle cases where file doesn't exist or key is invalid
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6a71c2c and 7880d5a.

📒 Files selected for processing (7)
  • Refhub/Areas/Admin/Views/ManageBook/Update.cshtml (2 hunks)
  • Refhub/Areas/Admin/Views/Shared/_ValidationScriptsPartial.cshtml (1 hunks)
  • Refhub/Controllers/BookController.cs (3 hunks)
  • Refhub/Service/Implement/BookService.cs (5 hunks)
  • Refhub/Service/Implement/S3FileUploaderService.cs (1 hunks)
  • Refhub/Tools/Static/FolderNameStatic.cs (0 hunks)
  • Refhub/Views/Shared/_ValidationScriptsPartial.cshtml (0 hunks)
💤 Files with no reviewable changes (2)
  • Refhub/Views/Shared/_ValidationScriptsPartial.cshtml
  • Refhub/Tools/Static/FolderNameStatic.cs
✅ Files skipped from review due to trivial changes (1)
  • Refhub/Areas/Admin/Views/Shared/_ValidationScriptsPartial.cshtml
🚧 Files skipped from review as they are similar to previous changes (3)
  • Refhub/Areas/Admin/Views/ManageBook/Update.cshtml
  • Refhub/Service/Implement/BookService.cs
  • Refhub/Controllers/BookController.cs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: 🐳 Docker Build & Test
🔇 Additional comments (4)
Refhub/Service/Implement/S3FileUploaderService.cs (4)

22-37: Constructor implementation looks good with proper configuration.

The constructor correctly initializes the S3 client with dynamic region endpoint configuration and ArvanCloud-specific settings. The use of ForcePathStyle = true is appropriate for ArvanCloud compatibility.


38-42: URL generation method is correctly implemented.

The GenerateS3Url method now properly constructs path-style URLs using the ServiceURL from configuration, which aligns with the ForcePathStyle = true setting and matches the extraction logic in GetKey.


45-59: Key extraction method has robust implementation.

The GetKey method correctly uses StartsWith validation and Substring for safe key extraction, which is more robust than the previous Replace approach. The English error message improves consistency.


61-78: File upload implementation is secure and well-structured.

The method correctly uses S3CannedACL.Private for security, generates unique keys with GUID to prevent overwrites, and properly handles the file stream. The key generation strategy ensures uniqueness while maintaining readability.

Comment on lines +95 to +119
public async Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName)
{
    var key = GetKey(fileUrl, bucketName);
    var request = new GetObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    try
    {
        using var response = await _s3Client.GetObjectAsync(request, ct);
        var ms = new MemoryStream();
        await response.ResponseStream.CopyToAsync(ms, ct);
        ms.Position = 0;

        return ms;
    }
    catch (AmazonS3Exception ex)
    {
        // مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
        throw new FileDownloadException("Error downloading file from S3", ex);
    }
}


🛠️ Refactor suggestion

Add input validation and consider memory efficiency improvements.

The download method lacks input validation and loads entire files into memory, which could be problematic for large files.

 public async Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName)
 {
+    if (string.IsNullOrWhiteSpace(fileUrl))
+        throw new ArgumentException("File URL cannot be null or empty", nameof(fileUrl));
+    
+    if (string.IsNullOrWhiteSpace(bucketName))
+        throw new ArgumentException("Bucket name cannot be null or empty", nameof(bucketName));

     var key = GetKey(fileUrl, bucketName);
     var request = new GetObjectRequest
     {
         BucketName = bucketName.ToLower(),
         Key = key
     };

     try
     {
         using var response = await _s3Client.GetObjectAsync(request, ct);
         var ms = new MemoryStream();
         await response.ResponseStream.CopyToAsync(ms, ct);
         ms.Position = 0;

         return ms;
     }
     catch (AmazonS3Exception ex)
     {
-        // مثلاً اگر فایل وجود نداشت یا کلید اشتباه بود
         throw new FileDownloadException("Error downloading file from S3", ex);
     }
 }

For large files, consider returning the S3 response stream directly instead of copying to MemoryStream, but ensure proper disposal is handled by the caller.

🤖 Prompt for AI Agents
In Refhub/Service/Implement/S3FileUploaderService.cs around lines 95 to 119, add
input validation to check that fileUrl and bucketName are not null or empty
before proceeding. To improve memory efficiency for large files, modify the
method to return the S3 response stream directly instead of copying it into a
MemoryStream, and update the method's documentation to indicate that the caller
is responsible for disposing the returned stream.

@abolfazlshs80
Contributor

/gemini review


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request migrates the application to use S3 for file storage. The changes include updates to the controller, service, and interface layers to support S3 file uploads and downloads. A critical bug was identified in the URL validation logic, and suggestions were made to improve logging and support cancellation of file operations.

}
private bool IsValidFileUrl(string fileUrl)
{
return !string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _);


critical

The condition !string.IsNullOrWhiteSpace(fileUrl) || !Uri.TryCreate(fileUrl, UriKind.Absolute, out _) uses a logical OR, which means the method returns true if the URL is not whitespace OR if the URL is invalid. This is likely incorrect. It should use a logical AND && to ensure the URL is both not whitespace AND a valid URI.

        return !string.IsNullOrWhiteSpace(fileUrl) && Uri.TryCreate(fileUrl, UriKind.Absolute, out _);

catch (Exception ex)
{
// Log the exception or handle it appropriately
Console.WriteLine($"Error saving changes: {ex.Message}");


medium

Consider using ILogger<BookService> for logging exceptions instead of Console.WriteLine. This allows for structured logging and integration with logging frameworks.

Comment on lines +169 to +170
var filePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug);
var imagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug);


medium

The CancellationToken should be passed to the UploadFile method to allow for cancellation of the upload operation. This requires updating the IFileUploaderService interface and its implementations to accept a CancellationToken.

            var filePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug, ct);
            var imagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug, ct);

Comment on lines +224 to +237
                await uploaderService.DeleteFile(existingBook.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf));
            }

-           existingBook.FilePath = await uploaderService.UpdloadFile(book.File, FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.Slug);
+           existingBook.FilePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug);
        }

        if (book.Image != null)
        {
            if (!string.IsNullOrWhiteSpace(existingBook.ImagePath))
            {
-               await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, existingBook.ImagePath);
+               await uploaderService.DeleteFile(existingBook.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages));
            }

-           existingBook.ImagePath = await uploaderService.UpdloadFile(book.Image, FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.Slug);
+           existingBook.ImagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug);


medium

Pass the CancellationToken to the DeleteFile and UploadFile methods to allow for cancellation of these operations.

            if (book.File != null)
            {
                if (!string.IsNullOrWhiteSpace(existingBook.FilePath))
                {
                    await uploaderService.DeleteFile(existingBook.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf), ct);
                }

                existingBook.FilePath = await uploaderService.UploadFile(book.File, BucketNameStatic.GetName(BucketNames.BookPdf), book.Slug, ct);
            }

            if (book.Image != null)
            {
                if (!string.IsNullOrWhiteSpace(existingBook.ImagePath))
                {
                    await uploaderService.DeleteFile(existingBook.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages), ct);
                }

                existingBook.ImagePath = await uploaderService.UploadFile(book.Image, BucketNameStatic.GetName(BucketNames.BookImages), book.Slug, ct);

Comment on lines +277 to +282
            await uploaderService.DeleteFile(book.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages));
        }

        if (!string.IsNullOrWhiteSpace(book.FilePath))
        {
-           await uploaderService.DeleteFile(FolderNameStatic.GetDirectoryBooks, FolderNameStatic.GetDirectoryImages, book.FilePath);
+           await uploaderService.DeleteFile(book.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf));


medium

Pass the CancellationToken to the DeleteFile methods to allow for cancellation of these operations.

            if (!string.IsNullOrWhiteSpace(book.ImagePath))
            {
                await uploaderService.DeleteFile(book.ImagePath, BucketNameStatic.GetName(BucketNames.BookImages), ct);
            }

            if (!string.IsNullOrWhiteSpace(book.FilePath))
            {
                await uploaderService.DeleteFile(book.FilePath, BucketNameStatic.GetName(BucketNames.BookPdf), ct);

Comment on lines +61 to +77
public async Task<string> UploadFile(IFormFile file, string bucketName, string name)
{
    var key = $"{name.Replace(" ", "-")}_{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";

    using var stream = file.OpenReadStream();
    var request = new PutObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key,
        InputStream = stream,
        ContentType = file.ContentType,
        CannedACL = S3CannedACL.Private
    };

    await _s3Client.PutObjectAsync(request);
    return GenerateS3Url(key, bucketName);


medium

Add a CancellationToken parameter to the UploadFile method and pass it to the PutObjectAsync method to allow for cancellation of the upload operation.

        public async Task<string> UploadFile(IFormFile file, string bucketName, string name, CancellationToken ct)
        {
            var key = $"{name.Replace(" ", "-")}_{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";

            using var stream = file.OpenReadStream();
            var request = new PutObjectRequest
            {
                BucketName = bucketName.ToLower(),
                Key = key,
                InputStream = stream,
                ContentType = file.ContentType,

                CannedACL = S3CannedACL.Private
            };

            await _s3Client.PutObjectAsync(request, ct);
            return GenerateS3Url(key, bucketName);
        }

Comment on lines +80 to +92
public async Task DeleteFile(string realUrl, string bucketName)
{
    var key = GetKey(realUrl, bucketName);

    var request = new DeleteObjectRequest
    {
        BucketName = bucketName.ToLower(),
        Key = key
    };

    await _s3Client.DeleteObjectAsync(request);


medium

Add a CancellationToken parameter to the DeleteFile method and pass it to the DeleteObjectAsync method to allow for cancellation of the delete operation.

        public async Task DeleteFile(string realUrl, string bucketName, CancellationToken ct)
        {


            var key = GetKey(realUrl, bucketName);

            var request = new DeleteObjectRequest
            {
                BucketName = bucketName.ToLower(),
                Key = key
            };

            await _s3Client.DeleteObjectAsync(request, ct);
        }

Comment on lines +5 to +7
Task<string> UploadFile(IFormFile file, string bucketName, string name);
Task DeleteFile(string realUrl, string bucketName);
Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName);


medium

Add a CancellationToken parameter to the UploadFile and DeleteFile methods to allow for cancellation of these operations.

    Task<string> UploadFile(IFormFile file, string bucketName, string name, CancellationToken ct);
    Task DeleteFile(string realUrl, string bucketName, CancellationToken ct);
    Task<Stream> DownloadFileAsync(string fileUrl, CancellationToken ct, string bucketName);

- fix the condition in IsValidFileUrl to properly validate file URLs
- ensure that the file URL is not null or whitespace and is a valid absolute URI
- add validation for fileUrl and bucketName to prevent empty values
- throw ArgumentException if fileUrl or bucketName is null or empty
@abolfazlshs80
Contributor

/gemini review


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request introduces S3 support for file storage, enhancing scalability and robustness. The changes are comprehensive, covering data access, services, and views. Review comments focus on dependency injection, logging, data integrity, and localized error messages to improve maintainability and user experience.

Comment on lines +107 to +111
catch (Exception ex)
{
// Log the exception or handle it appropriately
Console.WriteLine($"Error saving changes: {ex.Message}");
throw;


high

Using Console.WriteLine for logging exceptions is not suitable for production environments as it can be easily missed and doesn't provide structured logging capabilities. Inject ILogger<BookService> and use it for logging. For example:

_logger.LogError(ex, "Error saving changes: {Message}", ex.Message);
// Log the exception or handle it appropriately
_logger.LogError(ex, "Error saving changes: {Message}", ex.Message);
throw;

Comment on lines +22 to +37
public S3FileUploaderService(IOptions<S3Configuration> s3Options)
{
    this._s3Options = s3Options;

    var credentials = new BasicAWSCredentials(s3Options.Value.AccessKey, s3Options.Value.SecretKey);

    var config = new AmazonS3Config
    {
        RegionEndpoint = RegionEndpoint.GetBySystemName(s3Options.Value.Region), // Dynamically set region endpoint
        ServiceURL = s3Options.Value.ServiceURL,
        ForcePathStyle = true // Very important for working with ArvanCloud
    };

    _s3Client = new AmazonS3Client(credentials, config);
}


medium

Instead of manually creating the IAmazonS3 client, register it with the dependency injection (DI) container and inject the IAmazonS3 interface directly. The AWS SDK for .NET provides helper methods for this, like services.AddAWSService<IAmazonS3>().

Register it in Program.cs:

builder.Services.AddDefaultAWSOptions(builder.Configuration.GetAWSOptions());
builder.Services.AddAWSService<IAmazonS3>();

And then your service's constructor would be simplified to:

public S3FileUploaderService(IAmazonS3 s3Client, IOptions<S3Configuration> s3Options)
{
    _s3Client = s3Client;
    _s3Options = s3Options;
}
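For completeness, a sketch of the registration side in Program.cs. The section name "S3Configuration" is an assumption and must match appsettings.json (or user secrets, which this PR enables); the property names mirror those read in the constructor above:

```csharp
// Bind the options consumed by S3FileUploaderService.
builder.Services.Configure<S3Configuration>(
    builder.Configuration.GetSection("S3Configuration"));

// The alternative suggested above: let the AWS SDK build and own the client.
builder.Services.AddDefaultAWSOptions(builder.Configuration.GetAWSOptions());
builder.Services.AddAWSService<IAmazonS3>();
```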

Comment on lines +31 to 33
[Url(ErrorMessageResourceType = typeof(Messages),
ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }

medium

Consider validating ImagePath as a URL to ensure data integrity. The [Url] data annotation enforces that ImagePath is a valid URL, which matters now that images are served from S3.

[Url(ErrorMessageResourceType = typeof(Messages),
ErrorMessageResourceName = nameof(Messages.Book_ImagePathMustBeUrl))]
public string? ImagePath { get; set; }

Comment on lines +39 to +42
if (!IsValidFileUrl(fileUrl))
{
return NotFound(messageService.Get("InvalidFileName"));
}

medium

Returning NotFound without a localized message makes the error hard to understand. Consider returning a localized message via messageService.Get("InvalidFileName") to give the user more context.

return NotFound(messageService.Get("InvalidFileName"));
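One possible shape for the IsValidFileUrl guard referenced above (hypothetical; the PR's actual check may differ): accept only absolute http/https URLs so arbitrary local paths cannot be requested through the download endpoint:

```csharp
private static bool IsValidFileUrl(string fileUrl) =>
    Uri.TryCreate(fileUrl, UriKind.Absolute, out var uri)
    // Reject file://, ftp://, relative paths, etc.
    && (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps);
```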

Comment on lines 32 to +36
builder.HasMany(b => b.RelatedTo).WithOne(rt => rt.Book)
.HasForeignKey(bt => bt.BookId);
.HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Restrict);

builder.HasMany(b => b.RelatedFrom).WithOne(rf => rf.RelatedBook)
.HasForeignKey(rf => rf.RelatedBookId);
.HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Restrict);

medium

Enforce restricted delete behavior on the related-book relationships to prevent unintended cascading deletions. This preserves data integrity and guards against accidental data loss.

builder.HasMany(b => b.RelatedTo).WithOne(rt => rt.Book)
    .HasForeignKey(bt => bt.BookId).OnDelete(DeleteBehavior.Restrict);

builder.HasMany(b => b.RelatedFrom).WithOne(rf => rf.RelatedBook)
    .HasForeignKey(rf => rf.RelatedBookId).OnDelete(DeleteBehavior.Restrict);

Comment on lines +285 to +289
var relations = await context.BookRelations
.Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
.ToListAsync(ct);
if (relations.Any())
context.BookRelations.RemoveRange(relations);

medium

Consider removing related BookRelation entities before deleting a book to maintain data integrity. This prevents orphaned records and keeps the database consistent, especially with DeleteBehavior.Restrict configured on these relationships.

var relations = await context.BookRelations
    .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
    .ToListAsync(ct);
if (relations.Any())
    context.BookRelations.RemoveRange(relations);
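A sketch of how the full delete flow could look once this suggestion is applied (method name and surrounding members are assumptions, not the PR's exact code). Removing the dependent BookRelations in the same unit of work means the Restrict delete behavior on those relationships no longer blocks the delete:

```csharp
public async Task DeleteAsync(int bookId, CancellationToken ct)
{
    // Remove dependents first so Restrict FKs do not block the delete.
    var relations = await context.BookRelations
        .Where(br => br.BookId == bookId || br.RelatedBookId == bookId)
        .ToListAsync(ct);
    if (relations.Any())
        context.BookRelations.RemoveRange(relations);

    var book = await context.Books.FindAsync(new object[] { bookId }, ct);
    if (book is not null)
        context.Books.Remove(book);

    // Both removals are persisted in a single implicit transaction.
    await context.SaveChangesAsync(ct);
}
```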

@hootanht hootanht self-requested a review July 22, 2025 20:26
@labels-and-badges labels-and-badges Bot added the PR:APPROVED Review is approved label Jul 22, 2025
@hootanht hootanht merged commit 7effd2a into develop Jul 22, 2025
4 of 5 checks passed