Develop#207

Open
SaXoNG wants to merge 4 commits intomate-academy:masterfrom
SaXoNG:develop

Conversation


@SaXoNG SaXoNG commented Feb 24, 2026

No description provided.


@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Review Summary

Great job setting up the server and handling the routing and validation correctly! However, this submission requires changes to meet all the project requirements. The most critical issue is that the server currently buffers the entire file into memory before compression, which doesn't meet the 'use Streams' requirement. For large files, this is very inefficient. Please refactor this to pipe the request stream through a `zlib` transform stream and then directly to the response. Additionally, the compressed file extensions are incorrect. The server is generating names like `file.txt.gzip` instead of the required `file.txt.gz`. You'll need to map the `compressionType` from the form (`gzip`, `deflate`, `br`) to the correct extensions (`.gz`, `.dfl`, `.br`) in the `Content-Disposition` header.


✨ Please rate this review in the Platform's notification center [1]. I will be grateful for your feedback! ✨

Important

Fix the issues, commit the changes, and then re-request my review using the Re-request review button [2] in the Reviewers tab.

Tip

If you have any comments about this review, please leave a message in the Add a comment section below and mention me with the @mateacademy-ai-mentor tag [3].
I will read your message during my next review of your pull request. Please do not reply directly to my comments, as that feature is not supported yet.

Caution

Do not close the pull request until the review is complete. Closing it early can lead to unexpected behavior and disrupt the review process.

Footnotes

  1. Rate AI review example

  2. Re-request AI review example

  3. Add AI review comments example


```js
req.on('data', (chunk) => chunks.push(chunk));

req.on('end', () => {
```


The logic inside this `end` event handler doesn't meet the 'use Streams' requirement. Instead of processing the file in chunks as it arrives, your code buffers the entire request body into memory, which is inefficient for large files. A streaming approach would involve using a library to parse the multipart form data from the request stream, and then piping the file part through a `zlib` transform stream into the response.


```js
res.setHeader(
  'Content-Disposition',
  `attachment; filename=$(unknown).${compressionType}`,
```


The requirement specifies the file extensions should be `.gz`, `.dfl`, and `.br`. This implementation uses the `compressionType` value directly from the form, which would result in extensions like `.gzip` and `.deflate`. You'll need to map the compression type to the correct extension.
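One way to do that mapping is a small lookup table next to the header code (the `extensions` table and `buildDisposition` helper are illustrative names, not from the submission):

```javascript
// Map the form's compression type to the extension the task requires.
const extensions = {
  gzip: '.gz',
  deflate: '.dfl',
  br: '.br',
};

// Build the Content-Disposition value for a given original file name.
function buildDisposition(fileName, compressionType) {
  return `attachment; filename=${fileName}${extensions[compressionType]}`;
}
```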

@SaXoNG SaXoNG marked this pull request as draft February 24, 2026 14:55
@SaXoNG SaXoNG marked this pull request as ready for review February 24, 2026 14:55

@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Review Summary

Thank you for your submission. This requires a few revisions to meet the project's core requirements. The server currently buffers the entire file into memory before compression, which goes against the central requirement to use streams. Please refactor this to pipe the incoming request stream through a `zlib` transform stream (e.g., `zlib.createGzip()`) directly to the response. Additionally, the compressed file extensions are incorrect; you need to map the `compressionType` from the form (e.g., `gzip`) to the correct extension (e.g., `.gz`). Lastly, the `src/index.html` file appears to be missing, which prevents the application from being tested, so please restore it.



Comment on lines +42 to +45
```js
req.on('data', (chunk) => chunks.push(chunk));

req.on('end', () => {
  const body = Buffer.concat(chunks).toString('binary');
```


This implementation buffers the entire request body into memory before processing it, which does not meet the core 'use Streams' requirement. A streaming approach processes data in chunks as it arrives. You should refactor this to parse the incoming request stream, pipe the file data through a `zlib` transform stream (e.g., `zlib.createGzip()`), and then pipe the result to the response stream. This avoids loading the entire file into memory.

Comment on lines +81 to +94
```js
const compressors = {
  gzip: zlib.gzipSync,
  deflate: zlib.deflateSync,
  br: zlib.brotliCompressSync,
};

if (!fileBuffer || !compressionType || !compressors[compressionType]) {
  res.statusCode = 400;
  res.end();

  return;
}

const compressed = compressors[compressionType](fileBuffer);
```


The use of synchronous `zlib` methods blocks the server's event loop, which is inefficient. A streaming implementation would use asynchronous, stream-based methods like `zlib.createGzip()`, `zlib.createDeflate()`, and `zlib.createBrotliCompress()` which can be connected using `.pipe()`.


```js
res.setHeader(
  'Content-Disposition',
  `attachment; filename=$(unknown).${compressionType}`,
```


The filename extension is being set directly from the `compressionType` variable, which results in extensions like `.gzip`. According to the task requirements, you need to map the compression type to a specific extension: `gzip` -> `.gz`, `deflate` -> `.dfl`, and `br` -> `.br`.
