Are there documented errors, bug reports, or user reviews about CalcSD's calculators?
Executive summary
CalcSD’s own site documents methodological limits, data errors, and stability fixes. Its changelog lists specific bug fixes and upgrades [1], the site asks users to report instability [2], and its internal dataset documentation flags one dataset as “not recommended” because it mixes incompatible measurement types, listing corrected figures [3]. Independent third‑party review sites treat calcsd.info as a legitimate site but assess general trust and quality metrics rather than specific calculator bugs [4] [5] [6].
1. CalcSD documents methodological caveats and invites bug reports
CalcSD explicitly warns users about accuracy limits and asks to be informed of instability or other issues with the site [2]. A dedicated “How calcSD makes its calculations” page describes the statistical approximations used, explains why the team chose a normal approximation rather than a log‑normal model, and cautions that the rounded numbers displayed on screen are not the ones used in internal calculations [7]. Together, these amount to an acknowledgement that results can be sensitive to rounding, distributional assumptions, and dataset selection.
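The normal‑approximation approach that methodology page describes can be illustrated with a minimal sketch. The function name and the example mean/SD parameters below are hypothetical; they are not calcSD’s actual code or dataset values.

```python
import math

def percentile_normal(value, mean, sd):
    """Percentile of `value` under a normal approximation N(mean, sd).

    Illustrative only: calcSD describes using a normal approximation,
    but the parameters here are invented for the example.
    """
    z = (value - mean) / sd
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Made-up parameters: mean 13.1, sd 1.6 (units arbitrary).
p = percentile_normal(14.7, 13.1, 1.6)  # z = 1.0, roughly the 84th percentile
```

Note how the full‑precision z‑score feeds the CDF directly; rounding the inputs first, as a user reading the displayed figures might, would shift the resulting percentile, which is the sensitivity the site cautions about.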
2. The project maintains a changelog showing fixes and version changes
The site’s changelog records concrete maintenance activity. Version notes include re‑implementing volume percentile calculations using a bivariate distribution in 2019, UI and calculator fixes in 2023, and a v3.4 update in November 2024 that “re‑added many datasets” and improved volume calculations [1]. The changelog functions as a running public record of fixes rather than a set of formal external bug reports, but it does document that issues have been found and addressed over time.
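A volume percentile built on a bivariate distribution, as the 2019 changelog entry mentions, can be sketched with Monte Carlo sampling of correlated length/girth pairs. Everything here is an assumption for illustration: the cylinder volume model, the parameter names, and the correlation value are invented, and calcSD’s actual implementation is not reproduced.

```python
import math
import random

def volume_percentile(length, girth, m_l, s_l, m_g, s_g, rho, n=20_000, seed=0):
    """Estimate the percentile of a cylinder volume when length and girth
    follow a correlated (bivariate normal) distribution.

    Hypothetical sketch: parameters and the cylinder model are assumptions,
    not calcSD's published method.
    """
    rng = random.Random(seed)

    def vol(L, G):
        # Cylinder with circumference G: radius = G / (2*pi),
        # so volume = L * pi * r^2 = L * G^2 / (4*pi).
        return L * G * G / (4.0 * math.pi)

    target = vol(length, girth)
    below = 0
    for _ in range(n):
        # Draw a correlated pair via the Cholesky trick on two i.i.d. normals.
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        L = m_l + s_l * z1
        G = m_g + s_g * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
        if vol(L, G) < target:
            below += 1
    return below / n
```

The point of the bivariate treatment is visible in the sampler: because length and girth are drawn jointly with correlation `rho`, the volume percentile differs from what two independent one‑dimensional percentiles would imply.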
3. There are documented data errors inside the site’s own dataset documentation
CalcSD’s dataset list page publicly flags at least one dataset as “not a recommended dataset” because it contains “many errors and mixing bone‑pressed and non bone‑pressed measurements,” and it shows corrected sample‑size counts [3]. This is explicit self‑reporting of dataset errors that affect the calculators’ inputs, and it is evidence that the site tracks data‑quality issues that could produce incorrect percentiles if users select a problematic dataset.
4. User reviews and external assessments focus on legitimacy and trust, not detailed calculator bug logs
Third‑party review aggregators and scam‑checkers treat calcsd.info as legitimate or mixed‑trust rather than outright fraudulent: ScamAdviser reports the site appears “legit and safe” [5], ScamDoc shows an average trust score and recommends further investigation [6], and ScamDetector publishes a review page evaluating technical and reputation indicators [4]. None of these sources provides systematic user bug reports or reproducible error cases for the calculators themselves, so external reviews contribute reputation context but not a catalog of calculator bugs.
5. The project’s owner position and transparency affect how to interpret reported “errors”
The site’s About page identifies calcSD as a hobby project created by a single developer and notes the team’s efforts to aggregate and analyze academic sources while acknowledging methodological tradeoffs [8]. That framing explains the pattern: public changelog entries and dataset caveats are the project’s primary form of error disclosure, rather than a formal issue tracker or published peer review, so users find documented problems mostly through the site’s own pages [3] [1] rather than through structured external bug reports.
6. Balance of evidence: documented issues exist but are primarily internal and methodological
Documented errors and caveats do exist: the site self‑reports dataset errors [3], discusses methodological limits and precision concerns [7], and logs multiple rounds of fixes in its changelog [1], while external review sites comment on legitimacy without producing independent bug logs [4] [5] [6]. What is absent from the sourced record is a public, external catalogue of reproducible calculation errors or user‑submitted bug tickets beyond the site’s own notices; if more formal bug reports or user reviews are needed, they are not present in the provided sources.