I’ve given up on Twitter.
I’m way older and balder than last time. I also have a 4-year-old, so I’m sitting this one out. It looks like a PyBlosxom plugin would be a Saturday afternoon project or so, but I don’t really have Saturday afternoons free anymore.
Every now and then, you need to stress test your tools.
Spotted in the wild last week: three of the most common “viral” ad-hoc wireless networks, all visible from a single spot.
Like most people who waste a bit too much time online, I’ve reached the point where I have far too many feeds and accounts on “social” websites to keep track of. I tried creating a “life feed” using Yahoo Pipes, but it’s a bit of a PITA because of the varying quality of the feeds produced by these applications (e.g. broken or at least suspect date handling, bad use of GUIDs, and so on — and don’t get me started on how the various deficiencies in RSS contribute to this mess) and the fun and games involved in trying to use a graphical scripting environment in a browser.
I understand that Facebook will do something like this, but I’ve vowed to be the last person on earth to get a Facebook account — my (perhaps unfair) opinion is that Facebook is like MySpace, only without musicians — so I started looking elsewhere. I tried out Mugshot, but something about the “feel” of the service seemed off to me — it seemed like way too much work to get my various feeds and memberships integrated.
I’ve been seeing a ton of probes on UDP port 26185. Google has been unhelpful. Anyone else seeing these or know what they mean?
addendum: It looks like the UDP probes are sometimes paired with a TCP probe, like so:
Considering this was one of the original use cases for the Atom Syndication Format and its publishing API, this should be really, really easy, right? I have a valid Atom 1.0 feed that I want to pour into a new WordPress installation. In theory, I should be able to basically just pipe the Atom feed into the AtomPub endpoint, right? Has anyone actually done this on WordPress 2.3.x? I’ll add anything I find to this entry for the sake of others wanting to do the same thing.
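In case it helps anyone else experimenting with this, here’s a sketch of the “pipe the feed in” idea: split the full Atom feed into standalone entry documents, then POST each one to the blog’s AtomPub post collection with Basic auth. The endpoint URL and credentials below are placeholders, and I haven’t verified this against WordPress itself — treat it as a starting point, not a recipe.

```python
# Split an Atom feed into individual entry documents and POST each
# to an AtomPub collection. Endpoint/credentials are hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM_NS)  # serialize entries with a clean default ns

def split_feed(feed_xml):
    """Return each feed entry as a self-contained Atom entry document."""
    root = ET.fromstring(feed_xml)
    return [ET.tostring(entry, encoding="unicode")
            for entry in root.iter("{%s}entry" % ATOM_NS)]

def post_entry(entry_doc, endpoint, user, password):
    """POST one entry document to the AtomPub collection (untested sketch)."""
    req = urllib.request.Request(
        endpoint,
        data=entry_doc.encode("utf-8"),
        headers={"Content-Type": "application/atom+xml;type=entry"},
    )
    # AtomPub servers commonly take HTTP Basic auth over blog credentials.
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, endpoint, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    return opener.open(req)  # a 201 Created response means the entry landed
```

Looping `post_entry` over `split_feed(feed)` is the whole “pipe” — assuming the server accepts the entries as-is, which is exactly the part I’m asking about.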
Maybe it’s just because I’m a blogger who never posts (okay, that’s almost certainly it) but recent arguments that CAPTCHAs have failed as a means of stopping blog comment spam befuddle me. Honestly, I used to get dozens of comment spams a day (sometimes hundreds) and since implementing CAPTCHAs I get maybe one a month. It’s true that I get almost no legitimate comments either, but like I said, that’s mostly because I wrote maybe a dozen real entries in all of 2006…
“Ever get the feeling you’ve been cheated?”
Syndication politics are every bit as twisted as any soap opera you’ll see on daytime television. Only without the sex. And with a bunch of bearded fat guys in place of the pretty models.
Adam Engst of TidBITS spends some time talking about the retirement of the Info-Mac Network. I’ll pour a little on the ground for Info-Mac — it was one of the first internet services I ever used. Back when I was a freshman at university (1985) and started seriously using Macs, the Info-Mac mailing list was probably the first mailing list subscription I ever had. It was the closest thing to a worldwide Mac user group in existence at the time. About once a day, a mailing of Mac questions and answers went out, along with archive listings for various freeware programs (shareware didn’t exist yet; neither did spam).
At the time, I was mostly using lab computers around campus. These machines were networked using “line drivers”, little serial dongles that tied back to an SCP (I think it stood for Secondary Communications Processor) in the lab’s wiring closet at a blazing 19200 bps. Those SCPs tied into the campus network. This meant you could attach to one of the campus mainframes, UM or UB, and do stuff in a VT100 emulator. You could download from the Info-Mac archive to your mainframe account using FTP, then use Kermit in your terminal emulator to actually get the file down from the mainframe to a local machine. At some point, they deployed a short-lived standard called SLFP that let you do TCP/IP over phone lines (1200 or 2400 bps) or over the faster “direct” connections in the labs. I remember using a program called Macnet SLFP to download files from Info-Mac. We were amazed at the speed of the downloads — at 19.2k, even the huge programs of the day (300–500 KB) came down in just a few minutes.
The majority of the software I used to use in those days came via Info-Mac (go sumex-aim.stanford.edu!), so it has a warm place in my heart.
People have been wanting this ever since Google News launched, and now they’ve finally added the ability to subscribe to news search results, with no more need for hacks or screenscraping. Here’s a feed of news that contains the name of my hometown. Mom, if you click that link in Safari, you can follow the local news even from Florida. :)
As for complaining about one’s pet format not getting exclusive billing: stop that, it only makes you look like a ranting grump.
I had my newly 2.0 PSP with me all day today as we did our usual Saturday running around. One fun discovery: you can use the network setup config screen as a sort-of ghetto Wi-Fi sniffer. :) It’ll list any networks it detects, along with the SSID, signal strength (as a percentage, though of what I wonder), and the type of security (it currently detects none, WEP, and WPA-TKIP). There were wireless networks nearly everywhere we stopped today, about half of them secured. 802.11 is truly ubiquitous nowadays.
Anyway, a truly portable web browsing device is a big deal when teamed up with Wi-Fi everywhere. I spent some time sipping a lemonade at my favorite free Wi-Fi spot and casually browsing on the PSP, and it’s a much less awkward affair on a 10-ounce handheld than schlepping around an 8-pound laptop, unfolding it on the table, and so on. Don’t talk to me about browsing on mobile phones, either — that’s an exercise in purest pain in comparison (the PSP’s bright, sharp, wide screen makes all the difference).
I haven’t written anything about the “shipment” of Atom 1.0 because, well, I haven’t been writing anything, but, of course, I’m excited about it. A gent named Sam Pearson has been beavering away on supporting 1.0 in Blosxom. My current plans are to cut my 0.3 feed over as soon as I see a few more of the big aggregator vendors ship versions that support it.
Prompted by some discussions I've had this week, I've got a question for some of the less geeky of you out there. In this context, I'm referring to people who, generally, neither know nor particularly care about the "plumbing" of websites.
If you could answer in the comments, I'd really appreciate it -- this will make the site better, in the long run.
Do you currently use a news aggregator (standalone or through a portal, for example My Yahoo) to follow lots of sites without having to visit them in your web browser?
If you do use an aggregator, how did you find out about website syndication?
What do you think about the current process of finding / subscribing to a new syndicated feed? Do you look for 'XML' or 'RSS' icons, or specific icons for syndication services? Do you just let your browser or aggregator notify you when it detects a newsfeed?
Surely, the fastest way to generate a link from me is to bag on stupid little orange XML icons. I have absolutely no problem with proving myself completely predictable by linking to Anne van Kesteren's rant on the topic.
In the past somebody made the wrong choice, which the whole web now has to pay for by following the wrong route.
Dare Obasanjo had a great idea -- list the top 5 sites you read that still need syndicated feeds. Here are mine (heavily slanted towards entertainment, as most computer geeks have already gotten the clue):
Just logged into my GMail account (2061 MB and counting...) and noticed that the signon form is now branded as Google Accounts. Looks like the single sign-on thing for Google will be happening sooner rather than later.
Contest void where prohibited by law.