I have an idea: another way to avoid being tracked is to massively spam trash into the dataLayer object, pushing thousands of dollars' worth of fake purchase events, randomly generated user details, and other junk events. Perhaps by doing this your real data becomes hard to filter out. A side effect is that the collected data becomes unreliable overall, helping less privacy-aware people in the process.
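A minimal sketch of the idea, assuming the site uses the standard window.dataLayer array that gtm.js consumes (the event name and fields here are plausible-looking guesses, not anything a particular site is guaranteed to use):

    // Flood GTM's dataLayer with plausible-looking junk at jittered intervals.
    window.dataLayer = window.dataLayer || [];

    function randomId(len) {
      var s = '';
      while (s.length < len) s += Math.random().toString(36).slice(2);
      return s.slice(0, len);
    }

    function pushJunk() {
      window.dataLayer.push({
        event: 'purchase',                                // fake conversion event
        transaction_id: randomId(12),
        value: Math.round(Math.random() * 500000) / 100,  // fake order total, up to $5,000
        currency: 'USD',
        user_id: randomId(16)                             // random identity noise
      });
      // Re-arm with jitter so the cadence is less obviously synthetic.
      setTimeout(pushJunk, 30000 + Math.random() * 60000);
    }

    pushJunk();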
Since installing it in Firefox on this computer (18 months ago or so), Ad Nauseam has clicked ~$38,000 worth of ads that I never saw.
Between this and "track me not" i've been fighting back against ads and connecting my "profile" with any habits since 2016 or so. I should also note i have pihole and my own DNS server upstream, so that's thiry-eight grand in ad clicks that got through blacklists.
[Preface: I hate ads, I love uBlock Origin, I use Pi-hole, I'm a proponent of ad blockers]
I manage a Google Ads account with a $500,000 budget. That budget is spent on a mix of display ads, google search, and youtube ads.
Even if I knew that 10% of our budget was wasted on bot clicks, there's nothing I could do as an advertiser. We can't stop advertising... we want to grow our business and advertising is how you get your name out there. We also can't stop using Google Ads - where else would we go?
$38,000 in clicks boosts Google's revenue by $38k (Google ain't complaining). The only entity you're hurting are the advertisers using Google. Advertisers might see their campaigns performing less well, but that's not going to stop them from advertising. If anything, they'll increase budgets to counteract the fake bot clicks.
I really don't understand what Ad Nauseam is trying to achieve. It honestly seems like it benefits Google more than it hurts them. It directly hurts advertisers, but not enough that it would stop anyone from advertising.
Google has a system for refunding advertisers for invalid clicks. The $500k account that I manage gets refunded about $50/month in invalid clicks. I'm guessing if bot clicks started making a real dent in advertiser performance, Google would counter that by improving their bot detection so they can refund advertisers in higher volumes. If there's ever an advertiser-led boycott of Google Ads, Google would almost certainly respond by refunding advertisers for bot clicks at much higher rates.
> I really don't understand what Ad Nauseam is trying to achieve. It honestly seems like it benefits Google more than it hurts them.
Google is part of the problem, but they're neither the only ones nor the best to target through bottom-up approaches.
> It directly hurts advertisers, but not enough that it would stop anyone from advertising.
You know the saying about XML - if it doesn't solve the problem, you are not using enough of it.
> there's nothing I can do as an advertiser. We can't stop advertising...
We know. The whole thing is a cancer[0], a runaway feedback loop. No single enlightened advertiser can do anything about it unilaterally. Which is why the pressure needs to go up until ~everyone wants change.
--
[0] - https://jacek.zlydach.pl/blog/2019-07-31-ads-as-cancer.html
> I'm guessing if bot clicks started making a real dent in advertiser performance, Google would counter that by improving their bot detection so they can refund advertisers in higher volumes.
They already have methods to detect a lot. Like you said yourself, customers have no alternative, so why would they refund money they don't have to?
The point is to poison your ad tracking profile so that advertisers can't figure out who you are and what you'll buy.
No matter how secure your browser setup is, Google is tracking you. By filling their trackers with garbage, there's less that can personally identify you as an individual.
> It honestly seems like it benefits Google more than it hurts them. It directly hurts advertisers, but not enough that it would stop anyone from advertising.
GP fights against ads, not Google. And not being able to win 100% of the gain shouldn't restrain someone from taking action if they consider the win share worth the pain.
> $38,000 in clicks boosts Google's revenue by $38k
You should include costs here, and if (big if) a substantial part of the clicks comes from bots and gets refunded, the associated cost comes on top of the bill. In the end the whole business is impacted. I agree $50 out of $500k is pennies, though.
> I hate ads […] I manage a Google Ads account
[No cynicism here, I genuinely wonder] How do you manage your conscience, mood and daily motivation? Do you see a dichotomy in what you wrote, and if so, how did you arrive at that situation? Any future plans?
I’m asking as you kind of introduce the subject but if you’re not willing to give more details that’s totally fine.
I’d imagine that by this point in time, they are able to filter this specific type of noise out of the dataset. They have been tracking everyone for so long that I doubt there’s anyone they don’t know about, whether directly or via shadow profiles. These randomly generated users would just not match up to anything and would be fine to just drop.
Am I dumb, or does this article fail to explain what the tag manager actually does? And not just with a loaded word such as surveillance or spying, but an actual technical explanation of what they are selling and why it is bad.
Google Tag Manager is a single place for you to drop in and manage all the tracking snippets you might want to add to your site. When I've worked on B2C sites that run a lot of paid advertising campaigns, the marketing team would frequently ask me to add one tracking pixel or another, usually when we were testing a new ad channel. Want to start running ads on Snapchat? Gotta add the Snapchat tracker to your site to know when users convert. Now doing TikTok? That's another snippet. Sometimes there would be additional business logic for which pages to fire or not fire, and this would change more often. Sometimes it was so they could use a different analytics tool.
While these were almost always very easy tickets to do, they were just one more interruption for us and a blocker for the stakeholders, who liked to have an extremely rapid iteration cycle themselves.
GTM was a way to make this self-service, instead of the eng team having to keep this updated, and also it was clear to everyone what all the different trackers were.
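For context, the install itself is a single loader snippet pasted once into every page; everything after that is configured in the GTM web UI, which is exactly what makes it self-service. Reproduced from memory with the container ID redacted, so verify against Google's docs:

    <!-- Google Tag Manager -->
    <script>
    (function(w,d,s,l,i){
      w[l]=w[l]||[];  // make sure the dataLayer array exists
      w[l].push({'gtm.start': new Date().getTime(), event:'gtm.js'});
      var f=d.getElementsByTagName(s)[0],
          j=d.createElement(s),
          dl=l!='dataLayer'?'&l='+l:'';
      j.async=true;
      // Loads the container, which in turn injects whatever tags marketing configured.
      j.src='https://www.googletagmanager.com/gtm.js?id='+i+dl;
      f.parentNode.insertBefore(j,f);
    })(window,document,'script','dataLayer','GTM-XXXXXX');
    </script>
    <!-- End Google Tag Manager -->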
The chief reason is that websites pay for advertising and want to know if the advertising is working, and Google Tag Manager is the way to do that for Google Ads.
This is not unreasonable! People spend a lot of money on ads and would like to find out if and when they work. People act like it's an unspeakable, nebulous crime, but this is probably the most common use case by miles.
I was tasked with auditing third-party scripts at a client a couple of years ago. The marketing people were unable to explain what Tag Manager concretely does without resorting to "it tracks campaign engagement" mumbo jumbo, but were adamant that they can't live without it.
Google Tag Manager lets you add tracking stuff to your website without needing to touch the code every time, say if you want to track things like link clicks, PDF downloads, or people adding stuff to their cart.
It doesn't track things by itself. It just feeds your data to other tools, like Google Analytics or the Facebook Pixel, which do the tracking.
This kind of data lets businesses do stuff like send coupon emails to people who left something in their cart.
There are lots of other uses. Basically, any time you want to add code or track behavior without dealing with a developer.
Since you're asking: you could use it to tie together triggers and actions to embed code in specific situations (e.g. based on the URL or page state). It has automatic versioning. There's a preview feature for testing code changes before deploying, and a permission system for sharing view/edit access with others.
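As a concrete (made-up) example: a site pushes a structured event like the one below, and a GTM trigger configured to match event == 'add_to_cart' fires whichever tags you have attached to it, with the other fields available as dataLayer variables:

    // Hypothetical event push; GTM triggers match on the `event` key.
    window.dataLayer = window.dataLayer || [];
    window.dataLayer.push({
      event: 'add_to_cart',
      item_id: 'SKU-1234',   // made-up fields a tag can read as variables
      value: 19.99,
      currency: 'USD'
    });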
There's a section in the article titled, "WHAT DOES GOOGLE TAG MANAGER DO?":
> Whilst Google would love the general public to believe that Tag Manager covers a wide range of general purpose duties, it's almost exclusively used for one thing: surveillance.
It really isn't. I've been blocking all JavaScript for years now, selectively allowing what is essential for sites to run or using a private session to allow more/investigate/discover. Most sites work fine without their 30 JS sources, just allowing what is hosted on their own domain. It takes a little effort, but it's a fair price to pay to have a sane Internet.
The thing is - with everything - it's never easy to have strong principles. If it were, everyone would do it.
It's certainly not that bad if you have uMatrix to do it with, but I haven't found a reasonable way to do it on mobile. uMatrix does work on Firefox Mobile, but the UI is only semi-functional.
Using Firefox Add-Ons on a "smartphone" sucks because one has to access every Add-On interface via an Extensions menu.
In that sense _all_ Add-Ons are only semi-functional.
I use multiple layers: uMatrix + NetGuard + Nebulo "DNS Rules", at the least. Thus I have at least three opportunities where I can block lookups for and requests to Google domains.
Having tried both, IMHO they do not do exactly the same thing. One is pattern-based, the other is host-based. As such, one can use them together, simultaneously.
Not quite the same (I love uMatrix UI), but advanced mode in uBO is similar. It lacks filtering by data type (css, js, images, fonts,...) per domain, but it does resolve domains to their primary domain, revealing where they are hosted. A huge kudos to gorhill for both of these!
Yup, that's what I use as well, along with whatever that extension is called that makes allowing cookies a whitelist thing too, and PrivacyBadger/Decentraleyes.
Also, deleting everything when Firefox closes. It's a little annoying to re-login to everything every day, but again, they are banking on this inconvenience to fuck you over and I refuse to let them win. It becomes part of the routine easily enough.
That’s my default as well. Self-hosted/first-party scripts can load, but third-party scripts are blocked. The vast majority of sites work this way. If a site doesn’t work because it must have a third-party script, I tend to just close the tab. I really don’t feel like it has caused me to miss anything. There are usually 8 other sites with the same data on a slightly less hostile site.
Impossible to know because when I disable Javascript "the majority of the internet" works fine. As does a majority of the web.
I read HN and every site submitted to HN using TCP clients and a text-only browser, that has no Javascript engine, to convert HTML to text.
The keyword is "read". Javascript is not necessary for requesting or reading documents. Web developers may use it but that doesn't mean it is necessary for sending HTTP requests or reading HTML or JSON.
If the web user is trying to do something else other than requesting and reading, then perhaps it might not "work".
Echoing others, I've used NoScript for years and at this point it is practically unnoticeable.
Many sites work without (some, like random news & blogs, work better). When a site doesn't work, I make a choice between temporarily or permanently allowing it depending on how often I visit the site. It takes maybe 5 seconds and I typically only need to spend that 5 seconds once. As a reward, I enjoy a much better web experience.
StackOverflow switched over from spying with ajax.googleapis.com to GTM in the past year or so. All for some pointless, out-of-date jQuery code they could self-host. I wonder how much they're being paid to let Google collect user stats from their site.
If you're spending 99% of your time on your favourite websites that you've already tuned the blocking on? Barely a problem.
On the other hand if your job involves going to lots of different vendors' websites - you'll find it pretty burdensome, because you might end up fiddling with the per-site settings 15+ times per day.
If I’m at work using a work provided computer, I don’t bother with the blocking. They are not tracking me as I do not do anything as me. I’m just some corporate stooge employee that has no similarity to me personally.
My personal devices block everything I can get away with
The sites that don't work are usually the worst websites around - you end up not missing much. And if it's a store or whatever, you can unblock all js when you actually want to buy.
About as tiring as hearing about it all the time. Thank god it's a fringe topic these days but this article snuck it in. Probably the constant use of the word "surveillance" was an early tell haha.
Years ago, I worked on a site where we constantly had requests from the non-technical side of the company to make the site load faster. We were perplexed in engineering. The site loaded and was ready for us in a fraction of a second.
Eventually we realized that every dev ran uBO, and tried loading the site without it. It took about 5 seconds. Marketing and other parts of the company had loaded so much crap into GTM that it just bogged everything down.
VPN, so constantly changing IP.
Tor Browser for everyday browsing (it comes with NoScript preinstalled), so onion routing provides a double VPN. Regularly closed down so history is cleared.
Safari in private mode and Lockdown Mode for when Tor won't work (Tor IP blocked, or HD video too slow to stream over Tor). Safari's isolation in private mode is excellent: you can use two tabs with, say, emails, and neither will know the other is logged in.
Safari non private for sites I want available and in sync across devices.
Firefox in permanent private mode with uBlock Origin for when Safari's Lockdown Mode causes issues. (Bizarrely, Firefox containers don't work in private mode, so no isolation across tabs.)
Chromium for logged into Google stuff.
Chrome for web development.
Plus opting out of everything possible, including targeted ads.
I rarely see ads of anything I would want to buy, and VPN blocks most of it at its DNS.
Beyond that, anything else would be too much effort for me.
The advertising companies I'm sure know I am not susceptible to impulse buys on ads; I research and seek value for money, so I'm not really their target.
Do you just... log back in to Hacker News every day?
I downloaded the Mullvad browser (basically Tor without the onion protocol part) but having no way to save passwords ended up making it unusable for me
I don't think this article makes a good case for why you should.
>The more of us who incapacitate Google's analytics products and their support mechanism, the better. Not just for the good of each individual person implementing the blocks - but in a wider sense, because if enough people block Google Analytics 4, it will go the same way as Universal Google Analytics. These products rely on gaining access to the majority of Web users. If too many people block them, they become useless and have to be withdrawn.
OK - but then also in the wider sense, if site owners can't easily assess the performance of their site relative to user behavior to make improvements, now the overall UX of the web declines. Should we go back to static pages and mining Urchin extracts, and guessing what people care about?
Analytics can have good uses, but these days it's mostly used to improve things for the operator (more sales, conversions, etc) and what's best for the website isn't always the best for the user. And so I block all that.
>Use uBlock Origin with JavaScript disabled, as described above, but also with ALL third-party content hard-blocked. To achieve the latter, you need to add the rule ||.^$third-party to the My Filters pane.
This is a worse way to implement uBO's "Hard Mode" (except with JS blocked), which has the advantage that you can easily whitelist sites individually and set a hotkey to switch to lesser blocking modes.
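For reference, Hard Mode lives in uBO's dynamic filtering rules ("My rules" pane), not "My filters". As I remember it from gorhill's wiki, it amounts to this default-deny rule set, which you then relax per-site as needed:

    * * 3p block
    * * 3p-frame block
    * * 3p-script block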
I don't know, the memetics of surveillanceware or spyware mostly lead me to the belief that everything is weaponized to drain your money through ads/marketing instead of the direct approach of stealing my money.
Is there a good way to collect basic analytics if you have a site you're hosting on GitHub pages? In such cases I'd rather not rely on Google Analytics if I don't have to.
Blocking Google Tag Manager script injection seems to have few side effects.
Blocking third party cookies also seems to have few side effects.
Turning off Javascript breaks too much.
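For anyone replicating the first of those blocks: for the default third-party deployment of GTM, domain-level rules are all it takes, e.g. in standard uBO/ABP filter syntax:

    ||googletagmanager.com^
    ||google-analytics.com^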
https://noscript.net
You can then enable just enough JS to make sites work, slowly building a list of just what is necessary. It can also block fonts, WebGL, prefetch, ping and all those other supercookie-enabling techniques.
The same with traditional cookies. I use Cookie AutoDelete to remove _all_ cookies as soon as I close the tab. I can then whitelist the ones I notice impact on authentication.
Also, you should disable JavaScript JIT, so the scripts that eventually load are less effective at exploiting potential vulnerabilities that could expose your data.
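In Firefox the JIT tiers can be switched off in about:config; to the best of my knowledge these are the relevant prefs (names may shift between versions, so verify against your build):

    javascript.options.baselinejit = false   // disable the baseline JIT
    javascript.options.ion = false           // disable the IonMonkey optimizing JIT
    javascript.options.asmjs = false         // optionally, also asm.js
    javascript.options.wasm = false          // optionally, also WebAssembly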
> Used as supplied, Google Tag Manager can be blocked by third-party content-blocker extensions. uBlock Origin blocks GTM by default, and some browsers with native content-blocking based on uBO - such as Brave - will block it too.
> Some preds, however, full-on will not take no for an answer, and they use a workaround to circumvent these blocking mechanisms. What they do is transfer Google Tag Manager and its connected analytics to the server side of the Web connection. This trick turns a third-party resource into a first-party resource. Tag Manager itself becomes unblockable. But running GTM on the server does not lay the site admin a golden egg...
Serving the Google Analytics JS from the site's own domain makes it harder to block using only DNS (e.g. Pi-hole, hosts file, etc.).
One might think "yeah but the google js still has to talk to google domains", but apparently, Google lets you do "server-side" tagging now (e.g. running a google tag manager docker container). This means more (sub)domains to track and block. That said, how many site operators choose to go this far, I don't know.
What if we resolved every domain to 0.0.0.0 by default at the start? Visiting a website manually through the browser's URL bar would automatically whitelist it. Clicking links would also whitelist the domain of the link only. Sure, you'd have to occasionally allow some third-party domains as well. I guess it would be cumbersome at first, but after a while it would be pretty stable and wouldn't require much extra attention.
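The default-deny half of this already works with stock tooling. A sketch in dnsmasq (what Pi-hole is built on): sinkhole everything, forward only an allowlist. The automatic whitelist-on-navigation part would need a browser extension feeding the resolver, which as far as I know doesn't exist off the shelf:

    # Sinkhole every lookup by default
    address=/#/0.0.0.0
    # Forward allowlisted domains (and their subdomains) upstream
    server=/news.ycombinator.com/9.9.9.9
    server=/github.com/9.9.9.9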
We had a disgusting number of tags on some of our customer pages, and a few dozen of them started to have an effect on page load, especially if you were still on HTTP/1.1.
Edit: looks like this might exist already: https://addons.mozilla.org/en-US/firefox/addon/adnauseam/
Between this and "track me not" i've been fighting back against ads and connecting my "profile" with any habits since 2016 or so. I should also note i have pihole and my own DNS server upstream, so that's thiry-eight grand in ad clicks that got through blacklists.
https://www.trackmenot.io/faq
You don't have to buy privacy-violating ads. You don't have to buy targeted ads.
> The only entity you're hurting are the advertisers using Google.
That’s fine. Advertising is cancer. Reducing advertisers’ ROI is good too.
You don’t hate ads if you’re spending $500k on them. You just hate receiving ads, which makes you hypocritical.
https://web.archive.org/web/20200601034723/https://www.macob...
Carter invented it and got paid so they can bury it. Must be good tech.
Or.. you know.. offering a quality product?
Manscaped? Nah, generic women's razors. PCBWay? Nope, JLCPCB.
Screw your ads. Find a better way.
https://adnauseam.io/
Google banned it from the Chrome Web Store, but it can still be installed manually.
It’s used by marketing people to add the 1001 trackers they love to use.
https://github.com/gorhill/uBlock/wiki/Advanced-settings
https://noscript.net/
It has pretty advanced features but also basic ones that allow you to block scripts by source
I won't browse the Internet on my phone without it, everything loads instantly and any site that actually matters was whitelisted years ago.
If the frontend automatic js is blocked, it doesn’t matter.
https://github.com/gorhill/uBlock/wiki/Blocking-mode
https://github.com/gorhill/uBlock/wiki/Blocking-mode:-hard-m...
I thought the term was spyware.
Surveillanceware almost sounds like something necessary to prevent bad stuff. Is this corporate rebranding to make spyware sound less bad?
Doing that for years
https://developers.google.com/tag-platform/tag-manager/serve...
> <script async src="https://www.googletagmanager.com/gtag/js?xxxxxxx"></script>
I am going to use this for sure, but it is a little ironic.
This GitHub repo seems way more up-to-date: https://github.com/StevenBlack/hosts