The Grumpy Old Man
markm

Ideas for the Internets, Part 1

OK. So I've got this RSS reader, right? And I'm subscribed to a buncha different feeds, right? And some of the feeds "overlap", right?

Overlap.

OK, I mean that you'll see "Teenage Girl Rushed To Hospital After Caffeine Overdose" and you'll say "Yikes, I hope she's OK." And then you'll see it again. And then again. And again. And all the links point to, say, the same Daily Mail article.

So how about a "smart RSS" which can group the stories that all point to the same URL? It'd take a bit of parsing and some smahts to work it out, but I don't see anything technical that would stop it from being implemented.
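
For the curious, here's roughly what I mean, as a minimal Python sketch. The shape of the entries (feed name, title, link) and the list of tracking parameters to strip are my own assumptions; any RSS parser that hands you a title and a link would do.

    # A rough sketch of the "group by target URL" idea. The entry tuple shape
    # and the set of tracking parameters to strip are assumptions, not anything
    # a particular reader or feed guarantees.
    from collections import defaultdict
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                       "utm_term", "utm_content"}

    def canonical_url(url):
        """Normalise a link so the same story seen via different feeds compares equal."""
        parts = urlsplit(url)
        # Drop common tracking parameters and the fragment; lower-case the host.
        query = [(k, v) for k, v in parse_qsl(parts.query)
                 if k not in TRACKING_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc.lower(),
                           parts.path.rstrip("/") or "/", urlencode(query), ""))

    def group_by_story(entries):
        """entries: iterable of (feed_name, title, link) tuples from any RSS parser."""
        groups = defaultdict(list)
        for feed_name, title, link in entries:
            groups[canonical_url(link)].append((feed_name, title))
        return groups

Everything that lands in the same group is the same story, so the reader only has to show it once and can list the feeds it came from underneath.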

Except my laziness, of course. So I throw the idea out to you, good reader, in the hope that it will inspire.

Comments

I was actually thinking about something similar - I've got a bunch of people I read on Livejournal, and a bunch of people I read through my RSS reader, and some of them mirror most of their posts to a separate blog which means I have to see them twice. Worse are people who only mirror posts in a certain category, which means I still have to read both to get everything they write, but I'm guaranteed to see some duplicates.

Then there's the problem of an RSS feed barfing all over itself as it changes the datestamps on everything published in a year so the RSS reader shows them all as new again. So I was thinking of making an RSS reader that stored a hash of every post so it could choose not to show the same content again, even if the first time it was seen was a completely unrelated feed. (You'd need some sort of fuzzy matching to compensate for different formatting and things, probably. That would be a place to add URL checking in.)
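
(Author's note: here's a stdlib-only sketch of what I think the commenter is describing. The crude regex tag-stripping and the 0.9 similarity threshold are my assumptions; a real reader would want a proper HTML parser and something cheaper than comparing against every stored post.)

    # Minimal sketch of the "hash every post" dedup idea from the comment above.
    import hashlib
    import re
    from difflib import SequenceMatcher

    def normalise(html):
        """Reduce a post body to bare text so formatting differences don't matter."""
        text = re.sub(r"<[^>]+>", " ", html)   # crude tag strip (assumption)
        return " ".join(text.lower().split())  # collapse whitespace

    class SeenStore:
        """Remembers what the reader has already shown, across all feeds."""
        def __init__(self):
            self.hashes = set()
            self.texts = []   # kept around for fuzzy comparison

        def is_duplicate(self, html, threshold=0.9):
            text = normalise(html)
            digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
            if digest in self.hashes:
                return True   # byte-identical after normalisation
            # Fuzzy fallback for reformatted mirrors of the same post.
            if any(SequenceMatcher(None, text, old).ratio() > threshold
                   for old in self.texts):
                return True
            self.hashes.add(digest)
            self.texts.append(text)
            return False

Because the store is keyed on content rather than on the feed, a post mirrored to a second blog, or re-dated and pushed out again, would come back as a duplicate and never be shown twice.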

I'll buy that for a dollar!

(Alternatively: URL?)