Several decades ago, Michel Bégin, a collector from Québec, Canada, put together a study of worldwide collecting areas to determine which were the most cost-effective. He did this by plowing through a 1997 Scott Classic Specialized Catalogue, compiling data on all mint issues from 1840-1940. He then quantified his findings with a simple formula that captured which collecting areas had a low percentage of high-catalog-value (CV) issues and no issues with an extremely high CV.
Quantity with CV >$100
———————————————— x Highest CV
Total quantity issued
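In code, the formula above might look like this (a minimal sketch in Python; the function name, the list-of-values input, and the $100 threshold are my own framing of the formula, not from the original site):

```python
def affordability_score(catalog_values, threshold=100.0):
    """Bégin's score: (share of issues with CV above the threshold) x highest CV.

    Lower scores suggest a more cost-effective collecting area: few
    expensive issues, and no single issue with an extremely high CV.
    """
    if not catalog_values:
        return 0.0
    expensive = sum(1 for cv in catalog_values if cv > threshold)
    return (expensive / len(catalog_values)) * max(catalog_values)

# Example: 100 issues, 2 of them above $100, highest CV $500
# score = (2 / 100) * 500 = 10.0
print(affordability_score([5] * 98 + [200, 500]))
```

Note how both factors reward the same thing: a high share of expensive issues raises the first term, and a single extreme rarity raises the second, so areas scoring low on both come out cheapest to collect.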
He published his findings on his website, where they remained until, several years later, the site lapsed and disappeared. The study is now available only through the Internet Archive.
The Current Project
In November 2016, a user on a popular stamp forum brought it to the community’s attention that the site was still available, though only insofar as the Internet Archive maintained a copy. Upon hearing this, I decided to tackle the task of rebuilding something of comparable design.
About the Data
When Mr. Bégin undertook his study, he manually paged through a Scott catalog to assemble his data. I was not about to attempt that. I had the benefit of having access to data from the Michel catalog in electronic format, so that is the catalog I chose to use in assembling the current data.
Data in the Worldwide tables is from the 2015-2016 Michel catalog, with the exception of Germany data, which is from Michel’s 2015 Deutschland-Spezial.
The Table Types
Due to page-width limitations, each type (Mint & Used) is broken down into two tables. Additionally, there are two versions of the tables — a broad version and a detailed version. The detailed version contains many sub-areas for the various collecting areas (e.g., USA, USA Officials, USA Postmaster Provisionals, etc.), while the broad version has the sub-areas grouped under broader collecting areas (e.g., USA). They can be accessed from the buttons at the top right of each page.
- Mint (#) — Mint issues by price range
- Mint (%) — Mint percentage of total issues within the price ranges
- Used (#) — Used issues by price range
- Used (%) — Used percentage of total issues within the price ranges
Using the Tables
The table headers run across the top of the table, with filter boxes immediately below. Click any header to toggle the sort order between ascending and descending. The filter boxes allow you to search for text (the “IA” filter) or set a range of values to display (all other columns).
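The two filter behaviors can be sketched as follows (a hypothetical illustration only; the column names and sample rows are made up, and the actual site may implement filtering differently):

```python
# Hypothetical sample rows; "IA" = issuing area, the numeric column is illustrative.
rows = [
    {"IA": "USA", "Mint AQ": 12.5},
    {"IA": "USA Officials", "Mint AQ": 340.0},
    {"IA": "Iceland", "Mint AQ": 88.0},
]

def text_filter(rows, column, needle):
    """Keep rows whose text column contains the search string (case-insensitive)."""
    return [r for r in rows if needle.lower() in r[column].lower()]

def range_filter(rows, column, low=None, high=None):
    """Keep rows whose numeric column falls within the From/To range."""
    return [r for r in rows
            if (low is None or r[column] >= low)
            and (high is None or r[column] <= high)]

print([r["IA"] for r in text_filter(rows, "IA", "usa")])          # both USA rows
print([r["IA"] for r in range_filter(rows, "Mint AQ", 50, 100)])  # Iceland only
```

The point of the sketch is just that the “IA” box is a substring match while every other box is a numeric From/To range.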
A Few Notes & Tips
- There may be some translation errors in the IA names. The originals were in German, and I’m not fluent.
- I have not attempted to verify the accuracy of the data at an individual stamp level, as there were almost a million individual stamps in the data. I suspect there may be slight errors in places, as Michel sometimes lists set prices in addition to or in lieu of individual stamp CVs, which can lead to some duplication in the pricing data.
- I didn’t have access to unspecialized Michel data on the German areas, so the data from German areas is from the Specialized catalog, whereas the remaining Worldwide data is from unspecialized catalogs. Keep this in mind if comparing German collecting areas to non-German ones, as the German ones will contain far more listings than you’d find in an unspecialized catalog, including many rare varieties which can skew the pricing data.
- Finally, be aware that the AQ can be skewed by CV TBD and No CV entries. For mathematical purposes, both are treated as having a CV of zero in the database; this should not be mistaken for an actual CV of zero. For example, if an entire collecting area is only available Mint, there will be no Used values (No CV will be 100% of issues), and the Used AQ will therefore equal zero. If there is even a single issue with a CV in one of the given price ranges (€0-€10, €10-€50, €50-€100, >€100), the AQ will be larger than zero, so the easy way to exclude these outliers is to set the “From” filter in the AQ column to “.01”.
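The “.01” trick above can be illustrated in a few lines (a hypothetical sketch; the area names and AQ figures are invented for the example):

```python
# "CV TBD" and "No CV" are stored as 0, so a collecting area available
# only Mint ends up with a Used AQ of exactly 0 — not a genuinely free area.
areas = [
    {"IA": "Mint-only Area", "Used AQ": 0.0},   # every Used CV is "No CV"
    {"IA": "Normal Area",    "Used AQ": 45.2},
]

# Setting the AQ "From" filter to .01 hides the zero-AQ outliers:
visible = [a["IA"] for a in areas if a["Used AQ"] >= 0.01]
print(visible)  # only areas with a real Used AQ remain
```

Any genuinely priced area clears the .01 bar, so the filter removes only the rows whose zero comes from missing values.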
For the Germany-only version, see this thread.