tl;dr Browsing via Tor is still fine; hosting onions is ... (possibly) less fine.
The core idea is the ability to correlate IP state with onion state (is the IP down, is the onion down; is the IP up, is the onion up).
There are three* approaches below that balance on the same underlying principle: if an IP isn't reachable, and a site or service behind a .onion address goes down at the same time... that's interesting, possibly *very interesting*.
Quickest method last, most likely method first.
1 a) Monitor target onion addresses, log when an onion consistently fails to resolve over some $time window, then retrospectively investigate where those outages might exist via network operators, OSINT, etc.
1 b) Monitor ISP outages and, on an outage notification, query your target onions (a variation of 1a).
2) Create outages lasting milliseconds at the network/routing layer, query onion state inside those windows, and see what stays up and what goes down (most aggressive/least likely).
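A minimal sketch of the 1a-style monitor. The `probe` callable is an assumption standing in for a real reachability check (in practice a SOCKS5 connection through a local Tor client); it is injected here so the logging logic stands alone. A single failed probe is noise (circuit churn), so only a run of consecutive failures is logged as an outage worth correlating later.

```python
from typing import Callable, List, Tuple

def monitor_onion(
    addr: str,
    probe: Callable[[str], bool],
    samples: int,
    fail_threshold: int = 3,
) -> List[Tuple[int, str]]:
    """Poll an onion address and log up->down / down->up transitions.

    `probe` returns True if the onion answered on this tick. Only
    `fail_threshold` consecutive failures count as a real outage.
    """
    events: List[Tuple[int, str]] = []
    consecutive_fails = 0
    state = "up"
    for tick in range(samples):
        if probe(addr):
            consecutive_fails = 0
            if state == "down":
                state = "up"
                events.append((tick, "up"))
        else:
            consecutive_fails += 1
            if state == "up" and consecutive_fails >= fail_threshold:
                state = "down"
                events.append((tick, "down"))
    return events
```

The events list (tick, state) is the raw material for the retrospective correlation step.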
Interruption could be achieved by null-routing IP addresses for roughly the time an average onion takes to resolve. There may be other techniques available; this works at the level of high-level theory, and further exploration of interruption methods may reveal more. The interruption has to have the least possible impact on the targeted addresses, which is the most problematic part, yet it would reveal a huge amount of location information if it had a high percentage of reach. And where you can't reach, you can still take away 'we know it's not where we can reach' at the time of discovery.
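Whichever method produces the outage data, the correlation step can be sketched as interval-overlap scoring: given the onion's observed downtime intervals and candidate hosts' outage windows, score each candidate by how much of the onion's downtime its outages explain. All names here are illustrative, and the sketch assumes non-overlapping outage windows per host.

```python
from typing import Dict, List, Tuple

Interval = Tuple[float, float]  # (start, end) timestamps

def overlap(a: Interval, b: Interval) -> float:
    """Length of the overlap between two intervals (0 if disjoint)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def score_candidates(
    onion_downtime: List[Interval],
    candidate_outages: Dict[str, List[Interval]],
) -> Dict[str, float]:
    """Fraction of the onion's total downtime each candidate's outages cover."""
    total = sum(end - start for start, end in onion_downtime)
    scores = {}
    for host, outages in candidate_outages.items():
        covered = sum(overlap(d, o) for d in onion_downtime for o in outages)
        scores[host] = covered / total if total else 0.0
    return scores
```

A candidate whose score stays near 1.0 over many independent outages becomes a strong location hypothesis; a score near 0.0 supports the 'we know it's not there' conclusion.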
The onion network is getting smaller for hosting sites, but onions are still used in some file-serving software (such as OnionShare).
Mitigation: host your site in multiple locations, and burn onion addresses per use case if using them for OnionShare etc.
Why this model will fail
Cost and effort versus reward; legal risk.
This could possibly be negated by optimising the state-identification step (if the information is available).
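One way to think about that optimisation: if the target onion has a baseline downtime rate p (it is sometimes down for unrelated reasons), a single coincident outage proves little, but the chance of k independent coincidences by luck alone is p^k, so a handful of well-chosen observation windows may be enough. The numbers below are illustrative, and the model assumes the windows are independent.

```python
def false_positive_odds(baseline_down_rate: float, coincidences: int) -> float:
    """Probability that `coincidences` independent observation windows all
    caught the onion down purely by chance, given its baseline down rate."""
    return baseline_down_rate ** coincidences

# An onion that is down 5% of the time anyway, seen down in 4 windows:
# roughly 6.25e-06 -- chance alone is a very unlikely explanation.
odds = false_positive_odds(0.05, 4)
```

This is why fewer, better-targeted probes (method 1b or 2) can beat blanket monitoring on cost versus reward.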