Also, don't forget to set up an RSS or Atom feed for your website. Contrary to the recurring claim that RSS is dead, most of the traffic to my website still comes from RSS feeds, even in 2̶0̶2̶5̶ 2026! In fact, one of my silly little games became moderately popular because someone found it in my RSS feed and shared it on HN. [1]
From the referer (sic) data in my web server logs (which is not completely reliable but still offers some insight), the three largest sources of traffic to my website are:
1. RSS feeds - People using RSS aggregator services as well as local RSS reader tools.
2. Newsletters - I was surprised to discover just how many tech newsletters there are on the Web and how active their user bases are. Once in a while, a newsletter picks up one of my silly or quirky posts, which then brings a large number of visits from its followers.
3. Search engines - Traffic from Google, DuckDuckGo, Bing and similar search engines. This is usually for specific tools, games and HOWTO posts available on my website that some visitors tend to return to repeatedly.
RSS is my preferred way to consume blog posts. I also find that blogs with an RSS feed tend to be more interested in actually writing interesting content than in just chasing views or advertising. I guess this makes sense: it's hard to monetize views through an RSS reader.
It's funny: back in the Google Reader days, monetizing via RSS was quite common. You'd publish a truncated version to RSS and force readers to visit the site for the full version, usually just in exchange for ad views. Honestly, while it wasn't the greatest use of RSS, it was better than most paid blogs today, which are ad-wall pop-up pay-gate nightmares of UX.
Now that browser developers did their best to kill RSS/Atom...
Does a Web site practically need to do anything to advertise their feed to the diehard RSS/Atom users, other than use the `link` element?
Is there a worthwhile convention for advertising RSS/Atom visually in the page, too?
(On one site, I tried adding an "RSS" icon, linking to the Atom feed XML, alongside all the usual awful social media site icons. But then I removed it, because I was afraid it would confuse visitors who weren't very Web savvy, and maybe get their browser displaying XML or showing them an error message about the MIME content type.)
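For reference, feed auto-discovery is conventionally done with `link` elements in the page's `head`; feed readers (and the few browsers that still support discovery) look for them there. A minimal sketch, with placeholder titles and paths:

```html
<head>
  <!-- Atom feed: the type attribute tells clients how to parse it -->
  <link rel="alternate" type="application/atom+xml"
        title="Example Blog (Atom)" href="/feed.xml">
  <!-- Optional RSS 2.0 feed alongside it -->
  <link rel="alternate" type="application/rss+xml"
        title="Example Blog (RSS)" href="/rss.xml">
</head>
```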
I use RSS Style[1] to make the RSS and Atom feeds for my blog human readable. It styles the XML feeds and inserts a message at the top explaining that the feed is meant for news readers, not people, which technically makes it "safe" for less tech-savvy visitors.
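For anyone curious how that kind of styling attaches to a feed: it's done with an `xml-stylesheet` processing instruction near the top of the feed document, before the root element. A minimal sketch (the stylesheet path is a placeholder):

```xml
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="/feed.xsl"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <!-- entries as usual; browsers that honor the PI render via feed.xsl -->
</feed>
```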
Browsers really should have embraced XSLT rather than abandoning it. Now we're stuck trying yet again to reinvent solutions already handled by REST [1].
Shout out to Vivaldi, which renders RSS feeds with a nice default "card per post" style. Not to mention that it also has a feed reader built in as well.
For a personal site, I'd probably just do that. (My friends are generally savvy and principled enough not to do most social media, so no need for me to endorse it by syndicating there.)
But for a commercial marketing site that must be on the awful social media, I'm wondering about quietly supporting RSS/Atom without compromising the experience for the masses.
With the lack of styling, I'm sorry to say I didn't notice the RSS icon at first at all. Adding the typical orange background to the icon would fix that.
Is there any reason today to use RSS over Atom? Atom sounds like it has all the advantages, except maybe compatibility with some old or stubborn clients?
Based on my own personal usage, it makes total sense that RSS feeds still get a surprising number of hits. I have a small collection of blogs that I follow and it's much easier to have them all loaded up in my RSS reader of choice than it is to regularly stop by each blog in my browser, especially for blogs that seldom post (and are easy to forget about).
Readers come with some nice bonus features, too. All of them normalize styling, for example, and native reader apps support offline reading.
If only there were purpose-built open standards and client apps for other types of web content…
This is what I use. It’s on macOS too and amazing on both. Super fast, focused, and efficient.
It’s by far the best I’ve tried. Most other macOS readers aren’t memory managing their webviews properly which leads to really bad memory leaks when they’re open for long periods.
The question is, do you have this traffic because of RSS client crawlers that pre-loaded the content or from real users. I'm not pro killing RSS by the way, but genuinely doubtful.
> The question is, do you have this traffic because of RSS client crawlers that pre-loaded the content or from real users.
I have never seen RSS clients or crawlers preload actual HTML pages. I've only seen them fetch the XML feed and present its contents to the user.
When I talk about visitors arriving at my website from RSS feeds, I am not counting requests from feed aggregators or readers identified by their 'User-Agent' strings. Those are just software tools fetching the XML feed. I'm not talking about them. What I am referring to are visits to HTML pages on my website where the 'Referer' header indicates that the client came from an RSS aggregator service or feed reader.
It is entirely possible that many more people read my posts directly in their feed readers without ever visiting my site, and I will never be aware of them, as it should be. For the subset of readers who do click through from their feed reader and land on my website, those visits are recorded in my web server logs. My conclusions are based on that data.
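As a sketch of that kind of analysis, one could bucket hits by the Referer field of Apache's combined log format. The aggregator hostnames below are illustrative assumptions, not an exhaustive list:

```python
import re

# Combined Log Format: ... "request" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d+ \S+ "(?P<referer>[^"]*)"'
)

# Hostnames of feed aggregator services (illustrative, not exhaustive).
FEED_REFERERS = ("feedly.com", "newsblur.com", "theoldreader.com", "inoreader.com")

def classify(line: str) -> str:
    """Classify one access-log line by where the visitor came from."""
    m = LOG_RE.search(line)
    if not m:
        return "unparsed"
    referer = m.group("referer")
    if referer == "-":
        return "direct"  # no Referer: direct visit or a local reader app
    if any(host in referer for host in FEED_REFERERS):
        return "feed"    # clicked through from a feed aggregator service
    return "other"

line = ('1.2.3.4 - - [01/Jan/2026:00:00:00 +0000] '
        '"GET /blog/post.html HTTP/1.1" 200 5123 '
        '"https://feedly.com/i/subscription" "Mozilla/5.0"')
print(classify(line))  # feed
```

Counting the `feed` bucket over a full log gives roughly the click-through figure described above.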
> I have never seen RSS clients or crawlers preload actual HTML pages
Some setups, like ttrss with the Mercury plugin, will do that to restore full articles to the feed, but it's either on-demand or manually enabled per feed. Personally, I don't run it on many feeds other than a few more commercial platforms that heavily limit their feeds' default contents.
Presumably some of the more app-based RSS readers have such a feature, but I wouldn't know for certain.
We followed this practice at a Non-Profit I volunteered for some years ago. For us, it was motivated by a few reasons:
- we trained the community around us to look to our website first for the most recent news and information
- we did not want a social media platform to be able to cut us off from our community (on purpose or accident) by shuttering accounts or groups
- we did not want to require our users have accounts on any 3rd party platforms in order to access our postings
- but we still wanted to distribute our messaging across any platforms where large groups of our community members frequently engaged
Another aspect of our process that was specific to our situation and outside of POSSE - we only posted one topic/issue/announcement per blog post. We had a newsletter that would summarize each of these. Many organizations like ours would post summaries of many things to a single blog post, basically the same as the newsletter. However, this was cumbersome. For example, if someone in the community had a question, it was much clearer to link to a single post on our site that answered the question AND ONLY answered that question. It made for much better community engagement, better search engine indexing, cleaner content management, and just a better experience for everyone involved.
One of the biggest steps down in Facebook history was their removal of RSS syndication. There was a time in the past when you could subscribe your Facebook account to external RSS feeds. The entries in those feeds would create new content on your "Facebook wall". This essentially let you use any third party that supported RSS to publish content into your Facebook feed.
Facebook removed that feature. The effect of this was that people had to create content within Facebook instead of outside it. This reoriented the flow of content creation so that it must originate inside of Facebook, removing the ability to use FB as a passive consumer of content created in a workflow where the creators chose the entire flow.
IMHO this is one of the biggest steps down ever in FB history. It was one of the biggest attacks on the open web, and I'm sad to say that it mostly worked, and the internet at large is worse as a result.
I guess it happens when engineers stop driving decisions and the finance people take over. Won't be too good for the company's valuation if people can access the content elsewhere.
I guess that's why Discord is also locked down as much. They have community content that is inaccessible anywhere else but Discord.
I restarted blogging last year, going from a handful of blog posts to publishing consistently. All content gets published on my blog first. I've seen an ~8x increase in traffic. I was affected by zero-click results from Google's AI overviews, but the bulk of my traffic now comes from RSS readers.
>the bulk of my traffic now comes from RSS readers.
I don't think this is correct unless you mean strictly the number of HTTP requests to your web server.
You were the 9th most popular blogger on HN in 2025.[0] Your post says you have about 500 readers via RSS. How can that represent more readers than people who read your posts through HN? I'd guess HN brought you about 1M visitors in 2025 based on the number of your front page posts.
You are right, my statement may be a bit misleading or incomplete. The ~500 readers are not just local RSS readers; they include aggregator RSS bots. For example, I see Feedly reporting ~200 subscribers, newsreader reporting 50 subscribers, Feedbin, etc. Each of those only has between 1 and 3 IP addresses. So for each RSS bot, there is an arbitrary number of actual users reading. I can't track those accurately.
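Several aggregator fetchers advertise their subscriber counts right in the User-Agent string (Feedly's fetcher does this, for instance), so one can sum those up from the logs. A sketch, assuming the common `N subscribers` convention; the sample agent strings are illustrative:

```python
import re

# Many aggregator fetchers embed "<N> subscribers" in their User-Agent.
SUBS_RE = re.compile(r"(\d+)\s+subscribers", re.IGNORECASE)

def subscriber_count(user_agent: str) -> int:
    """Return the advertised subscriber count, or 0 if none is present."""
    m = SUBS_RE.search(user_agent)
    return int(m.group(1)) if m else 0

agents = [
    "Feedly/1.0 (+http://www.feedly.com/fetcher.html; 200 subscribers)",
    "Feedbin feed-id:12345 - 50 subscribers",
    "Mozilla/5.0 (ordinary browser)",
]
print(sum(subscriber_count(ua) for ua in agents))  # 250
```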
However, users can click on an RSS feed article and read it directly on my blog. These have a URL param that tells me they are coming from the feed. When an article isn't on HN's front page, the majority of my traffic comes from those feeds.
By the way, thank you for sharing this tool. Very insightful.
These are impressive metrics; are you able to make a living off your 10M views?
I'm planning to leave my job this year and focus on content, mostly have been considering YouTube, but if blogging can work too, might consider that as well
Not even close to making a living! It does pay for my server though which costs $15 a month. YouTube gives you much more visibility. I'll try to compile the numbers from my single Carbon ad placement and the donations I receive from readers.
But I also don't think I have the process in place to do Blog, YouTube, Podcast and hold a full time job. Yes the job is my source of income.
I'm a firm believer that data collected without a clear action associated with it is meaningless - and I couldn't think of an action I would take if my traffic goes up or down on my personal blog - but tbh I mainly blog for myself, not really to build an audience, so our objectives might differ
There are some actions you can take. For example, when my traffic plummeted, I saw through my logs that search engines were trying to access my search page with questionable queries. That's when I realized I became a spam vector. I gave a better rundown through the link I shared.
Just an FYI, the data used to reach those conclusions came from the server logs (Apache2 in my case). So if you run your own server or VPS, you already have this information.
This strategy is an alternative to PESOS (Publish Elsewhere, Syndicate (to your) Own Site) [0]. I really like this read on the IndieWeb website; it explains well why one would adopt this strategy for federation and emphasizes that "Friends are more important than federation", something a lot of nerds and hackers forget when defending their ideals.
You can have both! POSSE to post multiple places. PESOS to pull in anything posted directly in other places, i.e. anything that didn't originate from a POSSE post.
I really like this philosophy. I've been using it for a couple of years now - everything goes on my personal site, then I post links on Mastodon, Bluesky and Twitter and sometimes (if I remember to do so) LinkedIn, plus copy and paste it all into a Substack email every week or so.
I really need to automate it though - hard on Twitter and LinkedIn but still pretty easy for Bluesky and Mastodon.
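For Mastodon at least, the automation can be a few lines against the `POST /api/v1/statuses` endpoint. A sketch using only the standard library; the instance URL and token are placeholders, and the 500-character budget is an assumption of Mastodon's default limit (instances can raise it):

```python
from urllib import parse, request

MASTODON_LIMIT = 500  # default per-instance limit; instances can raise it

def build_status(title: str, url: str, limit: int = MASTODON_LIMIT) -> str:
    """Compose 'title url', truncating the title so the post fits the limit."""
    budget = limit - len(url) - 1  # one space between title and link
    if len(title) > budget:
        title = title[: budget - 1].rstrip() + "\u2026"
    return f"{title} {url}"

def post_to_mastodon(status: str, instance: str, token: str) -> int:
    """Publish a status; instance is e.g. 'https://mastodon.example'."""
    req = request.Request(
        f"{instance}/api/v1/statuses",
        data=parse.urlencode({"status": status}).encode(),
        headers={"Authorization": f"Bearer {token}"},
    )
    with request.urlopen(req, timeout=10) as resp:
        return resp.status

status = build_status("My new blog entry", "https://example.com/blog/entry/")
print(status)
# post_to_mastodon(status, "https://mastodon.example", "TOKEN")  # real call
```

Bluesky's API is similarly scriptable, though links there need explicit facet metadata, so it takes a bit more code.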
Have you looked at https://posseparty.com/ as a possible option? Supports integrations with those platforms and more, and "all" it needs is an Atom feed!
Ooh I hadn't seen that. I'm still hung up on character limits - I want to make sure the summary I include isn't truncated with ... and is instead the right length for that particular platform.
I know it’s gotten some push back but to be honest I’m fond of the more manual approach that you take on HN.
While I don’t follow nor am I necessarily interested in everything that you cover, I do appreciate the presence of having something like a local “correspondent” around when you do appear to provide trails of supplementary commentary. The lengths you go to in doing all of this tastefully and transparently do not go unnoticed.
I definitely won't be automating submission to places like HN.
I figure if you chose to follow me on Bluesky/Twitter/Mastodon/LinkedIn there's no ethical issue at all with me automating the process of having my new blog entries show up in my feeds there, as opposed to copying-and-pasting the links by hand.
No, no, perhaps you misunderstood me. I like how you link to your own writing in the discussions here. I don't suspect you to start automating that.
To tell you the truth I came to this actual submission to express my apathy toward the ‘POSSE’ concept but I saw you here and figured that I could somehow voice that feeling while simultaneously making mention of a sharing method that I do find worthwhile and more personable. And not an easy thing to pull off.
How much of your traffic comes from HN as opposed to the other platforms?
If we had stuck with standard semantic web microformats, RSS/Atom syndication, FOAF-ish graphs, URIs for identity but also anonymous pubkey identities with reputation graphs - we could have built an entirely distributed social media graph that worked like email.
But alas, Facebook pushed forward too fast to counter.
There's still a chance, but the software needs to focus on simplicity and ease of use. Publishing blobs of signed content that can be added to anything - HTML pages, P2P protocols, embedded into emails and tweets - maybe we can hijack the current systems and have distributed identity and publishing take over.
I take this approach with everything I post, though I only syndicate to Mastodon. I have an RSS and JSON feed for each of the content types (they all have different schema) on my site: posts, links, books, movies, concerts, status updates and a combined feed. I also maintain an ICS calendar subscription of upcoming album releases.
These items, in turn, can be optionally syndicated to Mastodon when published. For status updates, I have a field that supports Mastodon-specific text (for mentions and so forth).
I also expose an oembed endpoint that returns the appropriate data for each content type for platforms that support it.
Everything I read is from RSS feeds I follow via freshRSS. Links are saved to linkding and are transformed into TTS "podcasts" that are sent to audiobookshelf.
I've used Twitter to publish my thoughts for years. In the beginning, I'd write multiple threads to get my points across.
Then, when Twitter started supporting longer tweets, I started publishing essays and it got the job done.
But at the end of each year, it was really hard to trace all my posts and write reviews about them. That's exactly what brought me to POSSE. I've been maintaining my blog[1] since early 2020 and it feels really good to know that I own my stuff. Plus, over the years, it has opened up so many doors for me.
Too bad many of these walled-garden platforms have now started to demote posts if they contain external URLs. I'm battling that by posting the links as a comment to the original post, which contains a cover photo of the blog.
EchoFeed is a lovely service to enable this regardless of what service you use to publish on your own site (so long as it supports RSS/Atom/JSON). I've used it to good effect for my blog in the past.
Very similar to what I’m building with opal editor. The “site” is static Markdown which lives and is stored in your browser: CSS, images, Markdown and HTML. You can keep it as is with Markdown or compile it to HTML. From there you can easily push to Vercel, GitHub, Cloudflare or Netlify. Cheating the serverless bit of it by using CORS proxies.
I just started building my own website today with Django. I’m doing it because I just enjoy doing it. Most of my work is in data and ML infrastructure and it is just killing me. Working on the front end has opened my mind to possibility and given me new inspiration.
I love hn and was inspired by all the devs who have their own site. I was drowning in work, but put the Django architecture together on vacation, started putting things together today and it’s been a blast.
I don’t enjoy social media and was already thinking along POSSE lines intrinsically.
I appreciate this post and the author's perspective.
I just wanted to learn how to create an enterprise-grade web application. I read a book on Django last year and did a few tutorials and enjoyed it. I also deploy infra on GCP and it works well there. It costs about $60/month for baseline hosting with light traffic/storage. I will probably use it as an interface for some of my ML projects. I was also looking into Dart/Flutter, a much steeper learning curve for me personally.
This is pretty much how I began developing websites too. Except it was 2001 instead of 2026. And it was ASP (the classic ASP that predates ASP.NET) instead of Python. And I had a Windows 98 machine in my dorm room with Personal Web Server (PWS) running on it instead of GCP.
It could easily have been a static website, but I happened to stumble across PWS, which came bundled with a default ASP website. That is how I got started. I replaced the default index.asp with my own and began building from there. A nice bonus of this approach was that the default website included a server-side guestbook application that stored comments in an MS Access database. Reading through its source code taught me server-side scripting. I used that newfound knowledge to write my own server-side applications.
Of course, this was a long time ago. That website still exists but today most of it is just a collection of static HTML files generated by a Common Lisp program I wrote for myself. The only parts that are not static are the guestbook and comment forms, which are implemented in CL using Hunchentoot.
I'd like to have a POSSE setup for video: a landing page with a static image and transcript, a download button for very slowly downloading the video, metadata, and links to instantly available external copies, so that I can push as much of the server cost that video entails onto the big platforms.
Has anybody written about adapting POSSE for videos?
> Syndication can be done fully automatically by the server
At the risk of stating the obvious: this can get tricky; many popular social media platforms restrict automated posting. Policies around automation and/or API usage change often and may not even be fully public, as some overlap with anti-spam measures.
Buffer documents a number of workflows and limitations in their FAQs.
E.g. for a non-professional Instagram account, the user gets a notification to manually share a post via the Instagram app.
> you can prepare your post in Buffer, receive a notification on your mobile device when it’s time to post, then tap the notification and copy your post over to the social network to finish posting.
Postiz's docs show that users can create an app on Facebook and use that key, and it will auto-post.
I guess using POSSE for Instagram forces you to either create a personal app on Facebook, which is not easy, or make your Instagram account a business account.
Haven't been able to figure this out for Instagram - also the only social media that is still relevant for me.
(Thankfully?) never got into Twitter, where it seems to be easy.
I also thought this and didn’t get the point.
But the link was published in 2013, and for users nowadays who are used only to social media, not personal blogs, maybe it's worth mentioning…
I am not so sure. You need to speak in the native voice of each community. A LinkedIn post, a tweet, and an e-mail are all different. You need to get value from the network directly without expecting a click-through. A lot of engagement and authority happens via the network itself.
I think it's more accurate to see blogging as a distinct channel from other types of social media + content marketing
I really want to implement this, but I haven't been able to figure out how to do it for Instagram (the only social media that is really relevant in my friend circle) and WhatsApp/Signal groups, other than doing it manually.
If anyone has tips, especially for Insta, let me know...
This is my approach and I fully recommend it. My personal website is my canonical home address on the web. It has outlived a few platforms and many rounds of enshittification.
A few caveats:
- You will have different communities on each social network. Your personal website might be home to you, but to your users, it's not. You're just another creator on their platform of choice.
- Each community has its own vibe, and commands slightly different messaging. This is partly due to the format each platform allows. Each post will create parallel but different conversations.
- Dumping links is frowned upon. You should be a genuine participant in each community, even if you just repost the same stuff. Automation does not help much there.
- RSS and newsletters are the only audiences that you control, and they're worth growing. Everywhere else, people who explicitly want to follow you might never see your updates.
- You should own the domain you post to. This is your address on the internet, and it should stay yours.
- People do check your personal website. I was surprised to hear friends and acquaintances refer to things I post on my website.
I've found that managing the conversations across venues is way harder than publishing to them, and POSSE doesn't address this except for one line about backfeeds/"reverse syndication", which most mainstream services either don't support or actively sabotage. It easily takes more effort to engage across services than to post across them.
How do you fellow HN'ers separate their online with their corporate identity and day job?
I cannot rid myself of the suspicion that your average boss is going to have a prying eye on your online activities and may even use them against you one way or another, e.g. if you offer services/work on side projects that may in any way compete with your employer.
I only work for small companies that don’t have any business interest in areas where I want to maintain independent side projects.
When a startup I was working at made a successful exit and got acquired by a major corporation that did have business interests which overlapped with my side projects, I refused the bonus contract and froze my side project activity until leaving about a year later.
Don’t get cute. Avoid side projects that compete with your employer, and disclose unrelated side projects properly so that your employer is forced to acknowledge them. Do what it takes to avoid entanglement, making sacrifices if necessary.
My experience is that bosses read my blog, then when they or a fellow manager need to hire someone, have reached out to me asking me to apply. So it cuts both ways - maybe your shitty boss sees you blogging and sharing your experience, but a good boss will see that and go "I want this passionate and curious person to work for me".
IDK, your average boss is just a dude who has bills to pay and mouths to feed. They don't really care what happens as long as you're not doing something stupid, especially visibly and on their time.
I've been doing this for years with my site, and it's brought me a lot of joy that I can go back and search my site for various posts I've made over the last decade across all the platforms I use - I have a more high friction setup, but that's because of my own terrible choices
It’s almost like HN is a great platform for the POSSE model!
Awesome share thanks for the link. Will send to a family member who is looking to gain viewership with their writing - they usually post on medium I think.
POSSE can be applied to more than just social networks, it can be used to disrupt every marketplace!
In fact, I’m building open source SaaS for every vertical and leveraging that to build an interoperable, decentralized marketplace.
Social media is a marketplace as well. The good being sold is people’s content and the cost you pay is with your attention. The marketplace’s cut is ads and selling your data.
This post like many recent ones like it, essentially wants the internet to go backwards to what it once was pre-LLMs [edit: and pre-concentration]. I'd like to suggest that you should follow through and go all the way to pre-internet itself, and rediscover handwriting, in-person local meeting groups, non-digital relationships, and using your hands not on a keyboard. Today I (with difficulty) left my macbook closed all day until this evening (and this comment). Small steps.
I understand this attitude but when I look back at my rural youth I just hear you telling me that I should have had no one to talk to at all about many things.
[1] https://susam.net/from-web-feed-to-186850-hits.html
[1]: https://www.rss.style/
[1] https://tonysull.co/articles/mcp-is-the-wrong-answer/
[1]: https://rednafi.com
1000x yes to this! It can be really frustrating when a link takes me to FB, TW, IG, etc. - none of which I use.
I published a write up just this morning: https://idiallo.com/blog/what-its-like-blogging-in-2025
[0] https://refactoringenglish.com/tools/hn-popularity/domain/?d...
However, users can click on an RSS feed article and read it directly on my blog. These have a URL param that tells me they are coming from the feed. When an article isn't on HNs frontpage, the majority of traffic is coming from those feeds.
By the way, thank you for sharing this tool. Very insightful.
I'm planning to leave my job this year and focus on content, mostly have been considering YouTube, but if blogging can work too, might consider that as well
But I also don't think I have the process in place to do a blog, YouTube, and a podcast while holding a full-time job. Yes, the job is my source of income.
I want to add analytics to my blog too, haven't had any on my sites for about a decade.
I'm a firm believer that data collected without a clear action associated with it is meaningless - and I couldn't think of an action I would take if traffic to my personal blog goes up or down - but tbh I mainly blog for myself, not really to build an audience, so our objectives might differ.
[0]: https://indieweb.org/PESOS
I really need to automate it though - hard on Twitter and LinkedIn but still pretty easy for Bluesky and Mastodon.
While I don’t follow nor am I necessarily interested in everything that you cover, I do appreciate the presence of having something like a local “correspondent” around when you do appear to provide trails of supplementary commentary. The lengths that I see you go to to do all of this tastefully and transparently do not go unnoticed.
I figure if you chose to follow me on Bluesky/Twitter/Mastodon/LinkedIn there's no ethical issue at all with me automating the process of having my new blog entries show up in my feeds there, as opposed to copying-and-pasting the links by hand.
To tell you the truth I came to this actual submission to express my apathy toward the ‘POSSE’ concept but I saw you here and figured that I could somehow voice that feeling while simultaneously making mention of a sharing method that I do find worthwhile and more personable. And not an easy thing to pull off.
How much of your traffic comes from HN as opposed to the other platforms?
But alas, Facebook pushed forward too fast to counter.
There's still a chance, but the software needs to focus on simplicity and ease of use. Publishing blobs of signed content that can be added to anything - HTML pages, P2P protocols, embedded into emails and tweets - maybe we can hijack the current systems and have distributed identity and publishing take over.
These items, in turn, can be optionally syndicated to Mastodon when published. For status updates, I have a field that supports Mastodon-specific text (for mentions and so forth).
I also expose an oembed endpoint that returns the appropriate data for each content type for platforms that support it.
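An oEmbed endpoint like the one mentioned above returns a small JSON document describing a piece of content. A minimal sketch of such a response, using field names from the oEmbed 1.0 spec — the provider name and titles here are placeholders, not the commenter's actual site:

```python
import json

def oembed_response(title: str, author: str) -> str:
    """Build a minimal oEmbed 1.0 response of type 'link', the simplest
    content type: it carries only metadata, no embeddable HTML."""
    payload = {
        "version": "1.0",   # required by the oEmbed spec
        "type": "link",     # other types: photo, video, rich
        "title": title,
        "author_name": author,
        "provider_name": "example.com",        # hypothetical provider
        "provider_url": "https://example.com/",
    }
    return json.dumps(payload)
```

Consuming platforms discover the endpoint via a `<link rel="alternate" type="application/json+oembed" …>` tag in the page head and use the returned metadata to render a preview card.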
Everything I read is from RSS feeds I follow via freshRSS. Links are saved to linkding and are transformed into TTS "podcasts" that are sent to audiobookshelf.
Then, when Twitter started supporting longer tweets, I started publishing essays and it got the job done.
But at the end of each year, it was really hard to trace all my posts and write reviews about them. That's exactly what brought me to POSSE. I've been maintaining my blog[1] since early 2020 and it feels really good to know that I own my stuff. Plus, over the years, it has opened up so many doors for me.
Too bad many of these walled-garden platforms have now started to demote posts if they contain external URLs. I'm battling that by posting the links as a comment to the original post, which contains a cover photo of the blog.
[1]: https://rednafi.com
https://echofeed.app/
https://opaledx.com
https://github.com/rbbydotdev/opal
MIT-licensed and open source. No documentation yet, but it's coming very soon.
I love HN and was inspired by all the devs who have their own site. I was drowning in work, but put the Django architecture together on vacation, started building today, and it’s been a blast.
I don’t enjoy social media, and POSSE appeals to me intrinsically.
I appreciate this post and the author's perspective.
It could easily have been a static website, but I happened to stumble across PWS, which came bundled with a default ASP website. That is how I got started. I replaced the default index.asp with my own and began building from there. A nice bonus of this approach was that the default website included a server-side guestbook application that stored comments in an MS Access database. Reading through its source code taught me server-side scripting. I used that newfound knowledge to write my own server-side applications.
Of course, this was a long time ago. That website still exists but today most of it is just a collection of static HTML files generated by a Common Lisp program I wrote for myself. The only parts that are not static are the guestbook and comment forms, which are implemented in CL using Hunchentoot.
Has anybody written about adapting POSSE for videos?
For example: this is the RSS feed for Simon Willison: https://bsky.app/profile/simonwillison.net/rss
Somewhat related, predictions for the future of the web by IWC contributors:
https://vhbelvadi.com/indieweb-carnival-round-up-dec-2025
At the risk of stating the obvious: this can get tricky, as many popular social media platforms restrict automated posting. Policies around automation and/or API usage change often and may not even be fully public, since some overlap with anti-spam measures.
0. https://buffer.com
1. https://github.com/gitroomhq/postiz-app
Buffer documents a number of workflows and limitations in their FAQs.
E.g. for a non-professional Instagram account, the user gets a notification to manually share a post via the Instagram app.
> you can prepare your post in Buffer, receive a notification on your mobile device when it’s time to post, then tap the notification and copy your post over to the social network to finish posting.
source: https://support.buffer.com/article/658-using-notification-pu...
I guess using POSSE for Instagram forces you to either create a personal app on Facebook, which is not easy, or make your Instagram account a business account.
I think it's more accurate to see blogging as a distinct channel from other types of social media + content marketing
A few caveats:
- You will have different communities on each social network. Your personal website might be home to you, but to your users, it's not. You're just another creator on their platform of choice.
- Each community has its own vibe, and commands slightly different messaging. This is partly due to the format each platform allows. Each post will create parallel but different conversations.
- Dumping links is frowned upon. You should be a genuine participant in each community, even if you just repost the same stuff. Automation does not help much there.
- RSS and newsletters are the only audiences that you control, and they're worth growing. Everywhere else, people who explicitly want to follow you might never see your updates.
- You should own the domain you post to. This is your address on the internet, and it should stay yours.
- People do check your personal website. I was surprised to hear friends and acquaintances refer to things I post on my website.
But this is no longer available.
You have to copy and paste the article into Medium manually unfortunately.
I cannot rid myself of the suspicion that your average boss is going to have a prying eye on your online activities and may even use them against you one way or another, e.g. if you offer services or work on side projects that may in any way compete with your employer.
Anyone got experience to share in that regard?
Thinking about this famous precedent: https://news.ycombinator.com/item?id=27424195#27425041
When a startup I was working at made a successful exit and got acquired by a major corporation that did have business interests which overlapped with my side projects, I refused the bonus contract and froze my side project activity until leaving about a year later.
Don’t get cute. Avoid side projects that compete with your employer, and disclose unrelated side projects properly so that your employer is forced to acknowledge them. Do what it takes to avoid entanglement, making sacrifices if necessary.
Ask HN: Is starting a personal blog still worth it in the age of AI?
https://news.ycombinator.com/item?id=46268055
A website to destroy all websites
https://news.ycombinator.com/item?id=46457784
Awesome share, thanks for the link. I'll send it to a family member who is looking to gain viewership with their writing - they usually post on Medium, I think.
https://github.com/searlsco/posse_party/blob/main/LICENSE.tx...
In fact, I’m building open source SaaS for every vertical and leveraging that to build an interoperable, decentralized marketplace.
Social media is a marketplace as well. The good being sold is people’s content and the cost you pay is with your attention. The marketplace’s cut is ads and selling your data.
Some people are just using it to post more garbage into the platforms they already were using
But it's not an either/or proposition.
Also, keep in mind that POSSE and this site predate LLMs by quite a bit.
Let's not throw out the baby with the bathwater