Accept sizes with k/M/G/T/E suffixes. (#194)
Open
JurjenBokma wants to merge 1 commit into pauldreik:main
Conversation
Fixes #134
JurjenBokma (Author) commented May 28, 2025
#include <cstdlib>
#include <cstring>
#include <iostream>
#include <stdexcept>
Author
The file size conversion throws std::runtime_error on an invalid suffix, so we need to include <stdexcept>.
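A minimal sketch of the kind of conversion described above (this is an illustration, not the PR's actual implementation; the function name `parse_size` and the power-of-1024 interpretation of the suffixes are assumptions). An unrecognized suffix throws std::runtime_error, which is why `<stdexcept>` is needed:

```cpp
#include <cstddef>
#include <cstdint>
#include <stdexcept>
#include <string>

// Hypothetical helper: parse a size argument like "2k", "6M" or "20G"
// into a byte count. Suffixes are treated as powers of 1024.
std::uint64_t parse_size(const std::string& arg)
{
  std::size_t pos = 0;
  const std::uint64_t base = std::stoull(arg, &pos);
  if (pos == arg.size()) {
    return base; // plain number, no suffix
  }
  if (pos + 1 != arg.size()) {
    throw std::runtime_error("trailing characters after size suffix: " + arg);
  }
  std::uint64_t mult = 1;
  switch (arg[pos]) {
  case 'k': mult = 1024ULL; break;
  case 'M': mult = 1024ULL * 1024; break;
  case 'G': mult = 1024ULL * 1024 * 1024; break;
  case 'T': mult = 1024ULL * 1024 * 1024 * 1024; break;
  case 'E': mult = 1024ULL * 1024 * 1024 * 1024 * 1024 * 1024; break;
  default:
    // Unknown suffix: report the bad argument instead of silently misparsing.
    throw std::runtime_error("unknown size suffix in: " + arg);
  }
  return base * mult;
}
```

With such a helper, `-minsize 1G` would resolve to 1073741824 bytes.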
Change
Have the file and buffer size options accept arguments like '2k', '6M', '20G'.
Reasoning
I typed something like:
rdfind -minsize 1G .
and got confusing output until I figured out that the 'G' suffix was not supported. I would like such file size suffixes to work.
Testing
To my shame, I only tested the new feature cursorily, and neglected to re-run the tests already present.
Note
Thank you for sharing rdfind. Having to dedup a few TB on disk, I found it useful.