Entries tagged 'syndication'
remember amphetadesk?
i still get over a thousand requests a day from amphetadesk users for feeds that no longer exist. the particularly astute may notice that it has been over four years since the last release of amphetadesk.
unfortunately, those amphetadesk requests are only 20% of the requests i get every day for these long-dead feeds.
welcome back
martin jansen pointed out that my feed wasn’t showing up as updated on bloglines, and it turned out to be because i had broken the conditional GET handling for the feed. oops.
so if you’ve been relying on bloglines, you’ve missed a lot.
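(for anyone wondering what that means, here is a rough sketch of conditional GET handling for a feed. this is just an illustration in python, not the actual code behind this site, and the function and parameter names are made up.)

```python
# a rough sketch of conditional GET handling for a feed (not the actual
# code behind this site); the names here are made up for illustration.
from email.utils import formatdate, parsedate_to_datetime

def respond_to_feed_request(if_modified_since, feed_mtime, feed_body):
    """decide between a full 200 response and a bodyless 304.

    if_modified_since: the If-Modified-Since request header, or None
    feed_mtime: a timezone-aware datetime of the feed's last change
    feed_body: the feed xml as bytes
    """
    last_modified = formatdate(feed_mtime.timestamp(), usegmt=True)
    if if_modified_since:
        try:
            client_copy = parsedate_to_datetime(if_modified_since)
            if feed_mtime <= client_copy:
                # the client's copy is current: send headers only, no body
                return 304, {'Last-Modified': last_modified}, b''
        except (TypeError, ValueError):
            pass  # unparseable date; fall through to a full response
    return 200, {'Last-Modified': last_modified,
                 'Content-Type': 'application/rss+xml'}, feed_body
```

the important part is returning 304 with no body when the client’s copy is still current, which is what tells an aggregator like bloglines that nothing has changed (and which is what i had managed to break).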
feedtree is a peer-to-peer system for distributing weblog updates, bearing at least a passing resemblance to what i described once upon a time.
i wonder how well it would handle having the livejournal firehose hooked up to it, and if it is of any use to a service like pubsub that would want to get everything.
ben writes more about what sixapart is up to with the atom stream.
clearly sixapart is up to great evil, because they are not using rss and are avoiding the existing cloud and feedmesh names. i demand swift angry mob justice. i’ve even heard they have a special room in their new offices where they smash kittens with hammers.
brad implemented a stream of atom updates from livejournal, which is sort of like the blo.gs cloud interface on steroids. nifty.
foo camp is this weekend, so that means that the feedmesh thing is a year old now. and speaking of that, put me in the jason shellen “not invited, but whatever” camp and not the russell beattie “not invited, and they kicked my puppy” camp.
kevin roderick passes on the old news that the los angeles times is working with another company to build an rss aggregator. what a terrible idea. the first step for the times should be to publish their own damn rss/atom feeds.
when i graduated college (ten years ago!), one of the places i applied to work was the times. they didn’t get back to me before i had found a job. while at the gym this morning, all that was going through my head was how much of a blast i could have at a place like the times if simply given a mandate to kick ass. (another thought was that i am probably way more qualified to do that now than i would be even if i had been working at the times for the last ten years.)
if it’s a great time to be an entrepreneur, it should also be a great time for everyone trying to do kick-ass things on the web. if i were at the times, and the team at a paper from a small town in kansas was continuously out-innovating me, i’d go nuts. especially when they have released their web-building framework as open source, and the best thing i’ve got going is an rss aggregator being built by an external company that is going to suck.
blo.gs has been acquired by yahoo!
the sale of blo.gs has been completed, and i'm proud to announce that yahoo! has acquired the service. as of right now, give or take a few minutes, yahoo! is running blo.gs.
this is the sort of good home that i was looking for — yahoo! obviously has the resources to run and improve blo.gs in pace with the incredible growth of blogs (and syndication in general), and in talking with them it was also clear that we had some of the same vision for the future of the service and the ping/notification infrastructure.
for users of the website and the cloud interface, nothing much is changing. the service will continue to be completely open, and both yahoo! and i hope you continue to use it and help it grow.
even though i’ll no longer be operating blo.gs, i'm not going to disappear from the community. i’m still very interested in blogging and syndication, and believe that blo.gs will continue to have a major impact as a key player in the evolving ping and blogging infrastructure.
some people have asked about the privacy policy during the transition. yahoo! is keeping the blo.gs privacy policy. the data collected on blo.gs will continue to be subject to that privacy policy and you will be given the opportunity to consent to future changes.
announcing scraped.us
i finally got around to moving the feeds for the few sites i am still scraping to a dedicated domain, over at scraped.us. i also added atom versions of the feeds. the old feeds are redirected to the new ones.
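the redirects are nothing fancy. here is a hypothetical sketch of the idea in python; the old paths and the scraped.us urls below are made up for illustration, and the real thing could just as easily be a couple of lines of web-server configuration.

```python
# a hypothetical sketch of redirecting old feed urls to scraped.us;
# these paths and urls are made up, not the real feed locations.
OLD_TO_NEW = {
    '/rss/latimes-world.xml': 'http://scraped.us/latimes-world.rss',
    '/rss/latimes-tech.xml': 'http://scraped.us/latimes-tech.rss',
}

def redirect_old_feed(path):
    """return (status, headers) for a request to an old feed path."""
    new_url = OLD_TO_NEW.get(path)
    if new_url:
        # permanent redirect so well-behaved aggregators update their records
        return 301, {'Location': new_url}
    return 404, {}
```

a permanent (301) redirect is the polite choice, since well-behaved aggregators will update their subscriptions to point at the new urls.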
there’s no special reason for this, other than that it was annoying to process stats for this site because of all the noise from those feeds. now i can more easily isolate that traffic.
no comment
from “rss subscription central?” at internetnews.com:
"Dave [Winer] knows a lot about getting technology people to work together," [Jupiter Research analyst Gary] Stein said. He suggested Winer could tie his efforts to Feedmesh, an initiative that aims to tie all feeds together into a single data source that all aggregators could use.
subscriber counts for los angeles times feeds
i thought i’d do a quick count of how many people are subscribed to each of the scraped los angeles times news feeds i provide. this is based on unique ip addresses and the bloglines report of subscribers.
- world news: 1745
- national politics: 284
- california politics: 1201
- commentary: 210
- company town: 61
- technology: 428
- food: 102
suckers
regular sucking schedule is a new blog from glenn fleishman about syndication bandwidth issues. i decided to go ahead and implement conditional GET handling for my rss feeds, and to make the 403 response for people who have been blocked include a tiny rss feed that tells them they are a jerk (in a few more words). it turns out the throttling i had implemented many months ago had a much higher time limit than intended (it is now about one hour; it had been almost two days — oops!). i guess it is telling that it was broken that way for months and nobody ever complained.
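in case the shape of that is not obvious, here is a rough sketch of the throttle-plus-403 idea in python. the one-hour window matches what i described above, but the storage, the names, and the wording of the tiny feed are all made up for illustration.

```python
# a rough sketch of throttling plus the 403-with-a-tiny-rss-feed idea;
# the names, the in-memory storage, and the feed wording are made up.
import time

THROTTLE_SECONDS = 60 * 60  # the intended limit: about one hour between fetches
last_fetch = {}  # ip address -> unix time of the last allowed fetch

NAG_FEED = b"""<?xml version="1.0"?>
<rss version="2.0"><channel>
<title>you have been blocked</title>
<link>http://example.com/</link>
<description>you are fetching this feed far too often. back off.</description>
</channel></rss>"""

def check_throttle(ip, now=None):
    """return (status, body): (200, None) if the fetch is allowed, or
    (403, NAG_FEED) if this ip is hitting the feed too often."""
    now = time.time() if now is None else now
    previous = last_fetch.get(ip)
    if previous is not None and now - previous < THROTTLE_SECONDS:
        return 403, NAG_FEED
    last_fetch[ip] = now
    return 200, None
```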
it’s not really a matter of bandwidth for me; i’m just very annoyed at people who feel the need to suck down feeds every five minutes when they are only ever updated once per hour. i don’t care if the bandwidth is minimal (now) because of conditional GET handling; you’re still being a jerk as far as i’m concerned.
(and don’t get me started about the idiot at purdue who has been happily eating 403 responses for over two years.)
i wonder if i could use radio userland’s bugs in handling html entities in rss feeds to automatically unsubscribe the twenty or so radio users who keep pounding away at feeds that are dead. is there some localhost:5335-style url that i can force them to hit that would unsubscribe them from the feed?
assuming the decoding bug hasn’t been fixed, and all of radio’s preferences really are web-accessible, i'm sure there's all sorts of fun that could be had. (it smells like a big gaping hole in radio’s security. think you can trust all of the people you’re getting news from just because they've been reliable so far? think again.)