Re: [dev] sfeed: a simple RSS and Atom parser and reader

From: Hiltjo Posthuma <hiltjo_AT_codemadness.org>
Date: Sun, 5 Aug 2012 16:35:22 +0200

On Sun, Aug 5, 2012 at 4:11 PM, pancake <pancake_AT_youterm.com> wrote:
> I wrote rss2html with my own xml parser and http protocol (0deps) so many years ago to read my feeds.
In a previous version I had my own hacky XML parser, but it was too
hard to manage a lot of corner cases imho (CDATA, and HTML embedded in
XML specifically (eeew)). Expat also handles state while parsing a
buffer. I might rewrite the XML parsing though if I find a good
alternative.
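
To give an idea of what that buys you, here is a rough, generic sketch
of expat's push-style API (not sfeed's actual code): you register a few
callbacks and hand the parser a buffer at a time; it keeps its own
state between calls, so CDATA sections and tags split across buffer
boundaries are dealt with for you. Link it with -lexpat:

#include <stdio.h>
#include <expat.h>

/* generic illustration of expat's push-style parsing, not sfeed's code */

static void XMLCALL
startelem(void *data, const XML_Char *name, const XML_Char **atts)
{
	(void)data; (void)atts;
	printf("start: %s\n", name);
}

static void XMLCALL
endelem(void *data, const XML_Char *name)
{
	(void)data;
	printf("end: %s\n", name);
}

static void XMLCALL
chardata(void *data, const XML_Char *s, int len)
{
	(void)data;
	printf("data: %.*s\n", len, s);
}

int
main(void)
{
	char buf[BUFSIZ];
	size_t n;
	XML_Parser p = XML_ParserCreate(NULL);

	XML_SetElementHandler(p, startelem, endelem);
	XML_SetCharacterDataHandler(p, chardata);

	/* feed stdin to the parser a buffer at a time; expat keeps its
	 * own state between calls, so a tag split across buffers is
	 * handled transparently. */
	for (;;) {
		n = fread(buf, 1, sizeof(buf), stdin);
		if (XML_Parse(p, buf, (int)n, n == 0) == XML_STATUS_ERROR) {
			fprintf(stderr, "parse error at line %lu: %s\n",
			    (unsigned long)XML_GetCurrentLineNumber(p),
			    XML_ErrorString(XML_GetErrorCode(p)));
			return 1;
		}
		if (n == 0)
			break;
	}
	XML_ParserFree(p);
	return 0;
}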

I like to use curl because it handles https and http redirection, and
also allows me to pass the date of the latest update so HTTP caching
works too. Curl can easily be replaced by wget or fetch, though.
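
Roughly like this (not the exact invocation sfeed_update uses, and the
stamp file path is made up for the example): -L follows redirects and
-z makes curl send an If-Modified-Since header based on the mtime of
the given file, so feeds that have not changed are not re-downloaded.

	curl -s -L -z "$HOME/.sfeed/lastfetch" 'https://example.org/feed.xml'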

> Actually, the only useful feature was the 'planet' option which sorts/merges all your feeds in a single timeline.
You can specify multiple feeds in a config file and run sfeed_update
with this config file as a parameter, then pipe the output through
sfeed_html.
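
For example, roughly (the config format and paths here are only an
illustration, see the sfeed README for the exact details):

	# sfeedrc: a small shell snippet sourced by sfeed_update
	feeds() {
		feed "example"  "https://example.org/feed.xml"
		feed "example2" "https://example.com/atom.xml"
	}

	# update all feeds, then generate one merged HTML page
	sfeed_update ~/.sfeed/sfeedrc
	sfeed_html < ~/.sfeed/feeds > ~/feeds.html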

> The html output of my tool supports templates so i use it to create a planet.foo website to read news.
I don't support templates; the output is just hard-coded in
sfeed_html.c atm.

> I end up using twitter. RSS is so retro.
I actually follow some people on twitter via RSS, even though I don't
use twitter myself. You can for example use the url:
https://api.twitter.com/1/statuses/user_timeline.rss?include_rts=true&screen_name=barackobama&count=25

> I also wanted to have a way to keep synced my already read links. But that was a boring task.

Atm I just mark all items that are a day old or newer as new in
sfeed_html and sfeed_plain. In your browser, visited links will of
course be coloured differently.
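
The check itself is tiny; something like this (a sketch of the idea,
not the actual sfeed code):

#include <time.h>

/* an item counts as new when it is at most one day older than now */
static int
isnew(time_t itemtime)
{
	return itemtime >= time(NULL) - 86400; /* 24 * 60 * 60 seconds */
}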