Every now and then, you need to stress test your tools.
Spotted in the wild last week: 3 of the most common "viral" ad-hoc wireless networks while sitting at a single spot.
Like most people who waste a bit too much time online, I’ve reached the point where I have far too many feeds and accounts on “social” websites to keep track of. I tried creating a “life feed” using Yahoo Pipes, but it’s a bit of a PITA because of the varying quality of the feeds produced by these applications (i.e. broken or at least suspect date handling, bad use of GUIDs, etc., and don’t get me started on how the various deficiencies in RSS contribute to this mess) and the fun and games involved in trying to use a graphical scripting environment in a browser.
I understand that Facebook will do something like this, but I’ve vowed to be the last person on earth to get a Facebook account — my (perhaps unfair) opinion is that Facebook is like MySpace, only without musicians — so I started looking elsewhere. I tried out Mugshot, but something about the “feel” of the service seemed off to me — it seemed like way too much work to get my various feeds and memberships integrated.
I’ve been seeing a ton of probes on UDP port 26185. Google has been unhelpful. Anyone else seeing these or know what they mean?
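If anyone wants to compare payloads, a quick Python sketch along these lines will grab one of the datagrams for a closer look (nothing in it is specific to port 26185):

```python
import socket

def capture_probe(port, timeout=60.0):
    """Listen for one UDP datagram on the given port and return
    (source_address, payload_bytes), or None if nothing arrives
    before the timeout."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("0.0.0.0", port))
    try:
        data, addr = sock.recvfrom(4096)
        return addr, data
    except socket.timeout:
        return None
    finally:
        sock.close()
```

Run `capture_probe(26185)` while the probes are coming in and it hands back the source address and the raw payload; `tcpdump -X udp port 26185` gets you the same thing with less typing.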
addendum: It looks like the UDP probes are sometimes paired with a TCP probe, like so:
Considering this was one of the original use cases for the Atom Syndication Format and API, this should be really, really easy, right? I have a valid Atom 1.0 feed that I want to pour into a new WordPress installation. In theory, I should be able to basically just pipe the Atom feed into the AtomPub endpoint, right? Has anyone actually done this on WordPress 2.3.x? I’ll add anything I find to this entry for the sake of others wanting to do the same thing.
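In case it helps anyone trying the same thing, here’s roughly the shape of what I mean, as a Python sketch: split the feed into standalone entries and POST each one to the collection. The `wp-app.php/posts` collection URL and the Basic-auth setup are assumptions about the WordPress install, so check those against yours:

```python
import base64
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def feed_entries(atom_xml):
    """Split an Atom 1.0 feed document into standalone <entry>
    payloads, returned as (title, serialized_entry_bytes) pairs."""
    root = ET.fromstring(atom_xml)
    out = []
    for entry in root.findall(ATOM + "entry"):
        title = entry.findtext(ATOM + "title") or ""
        out.append((title, ET.tostring(entry, encoding="utf-8")))
    return out

def post_entry(endpoint, user, password, body):
    """POST one serialized entry to an AtomPub collection using
    HTTP Basic auth; returns the response status code."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        # e.g. http://example.com/wp-app.php/posts -- assumed default,
        # verify against your own install
        endpoint,
        data=body,
        headers={
            "Content-Type": "application/atom+xml;type=entry",
            "Authorization": "Basic " + auth,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Whether WordPress honors things like the original entry dates on the way in is exactly the kind of thing I’m hoping someone has already fought through.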
Maybe it’s just because I’m a blogger who never posts (okay, that’s almost certainly it) but recent arguments that CAPTCHAs have failed as a means of stopping blog comment spam befuddle me. Honestly, I used to get dozens of comment spams a day (sometimes hundreds) and since implementing CAPTCHAs I get maybe one a month. It’s true that I get almost no legitimate comments either, but like I said, that’s mostly because I wrote maybe a dozen real entries in all of 2006…
“Ever get the feeling you’ve been cheated?”
Syndication politics are every bit as twisted as any soap opera you’ll see on daytime television. Only without the sex. And with a bunch of bearded fat guys in place of the pretty models.
Adam Engst of TidBITS spends some time talking about the retirement of the Info-Mac Network. I’ll pour a little on the ground for Info-Mac — it was one of the first internet services I ever used. Back when I was a freshman at university (1985) and started seriously using Macs for the first time, the Info-Mac mailing list was probably the first mailing list subscription I ever had. It was probably the closest thing to a worldwide Mac user group in existence at the time. About once a day, a mailing of Mac questions and answers was sent out, along with archive listings for various freeware programs (shareware didn’t exist yet. Neither did spam.)
At this time, I was mostly using lab computers around campus. These machines were connected to the campus network using “line drivers”, little serial dongles that tied back to an SCP (I think it stood for Secondary Communications Processor) in the lab’s wiring closet at a blazing 19,200 bps. Those SCPs tied into the campus network. This meant you could attach to one of the campus mainframes, UM or UB, and do stuff in a VT100 emulator. You could download from the Info-Mac archive to your mainframe account using FTP, then use Kermit in your terminal emulator to actually get the file down from the mainframe to a local machine. At some point, they deployed a short-lived standard called SLFP that let you do TCP/IP over phone lines (1200 or 2400 bps) or over the faster “direct” connections in the labs. I remember using a program called Macnet SLFP to download files from Info-Mac. We were amazed at the blazing speeds of the downloads — at 19.2k, even the huge programs of the day (300-500 KB) came down in just a few minutes.
The majority of the software I used to use in those days came via Info-Mac (go sumex-aim.stanford.edu!), so it has a warm place in my heart.
People have been wanting this ever since Google News launched, and now they’ve finally added the ability to subscribe to news search results, with no more need for hacks or screenscraping. Here’s a feed of news that contains the name of my hometown. Mom, if you click that link in Safari, you can follow the local news even from Florida. :)
As for complaining about one’s pet format not getting exclusive billing, stop that, it only makes you look like a ranting grump.
I had my freshly updated 2.0 PSP with me all day today as we did our usual Saturday running around. One discovery: you can use the network setup config screen as a sort-of ghetto Wi-Fi sniffer. :) It’ll list any networks it detects, along with the SSID, signal strength (as a percentage, though of what I wonder), and the type of security (currently none, WEP, and WPA-TKIP are detected.) There were wireless networks nearly everywhere we stopped today, about half of them secured. 802.11 is truly ubiquitous nowadays.
Anyway, a truly portable web browsing device is really a big deal when teamed up with Wi-Fi everywhere. I spent some time sipping a lemonade at my favorite free Wi-Fi spot and casually browsing on the PSP, and it’s a much less awkward affair on a 10-ounce handheld than it is schlepping around an 8 pound laptop, unfolding it on the table, etc. Don’t talk to me about browsing on mobile phones, either — that’s an exercise in purest pain in comparison (the PSP’s bright, sharp, wide screen makes all the difference.)
I haven’t written anything about the “shipment” of Atom 1.0 because, well, I haven’t been writing anything, but, of course, I’m excited about it. A gent named Sam Pearson has been beavering away on supporting 1.0 in Blosxom. My current plans are to cut my 0.3 feed over as soon as I see a few more of the big aggregator vendors ship versions that support it.
Prompted by some discussions I've had this week, I've got a question for some of the less geeky of you out there. In this context, I'm referring to people who, generally, neither know nor particularly care about the "plumbing" of websites.
If you could answer in the comments, I'd really appreciate it -- this will make the site better, in the long run.
Do you currently use a news aggregator (standalone or through a portal, for example My Yahoo) to follow lots of sites without having to visit them in your web browser?
If you do use an aggregator, how did you find out about website syndication?
What do you think about the current process of finding / subscribing to a new syndicated feed? Do you look for 'XML' or 'RSS' icons, or specific icons for syndication services? Do you just let your browser or aggregator notify you when it detects a newsfeed?
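For the curious, the browser/aggregator notification in that last question works via feed autodiscovery: a `<link rel="alternate">` element in the page head that points at the feed. A rough Python sketch of what the software looks for:

```python
from html.parser import HTMLParser

# The two MIME types an aggregator typically watches for.
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedFinder(HTMLParser):
    """Collect feed URLs the way browsers do: by spotting
    <link rel="alternate"> tags with a feed MIME type."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link"
                and "alternate" in (a.get("rel") or "").lower()
                and (a.get("type") or "").lower() in FEED_TYPES
                and a.get("href")):
            self.feeds.append(a["href"])

def discover_feeds(html):
    finder = FeedFinder()
    finder.feed(html)
    return finder.feeds
```

Point it at a page’s HTML and it returns the advertised feed URLs, which is why you never have to hunt for the little orange icon on sites that do this right.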
Surely, the fastest way to generate a link from me is to bag on little, orange, stupid XML icons. I have absolutely no problem with proving myself completely predictable by linking Anne van Kesteren's rant on the topic.
In the past somebody made the wrong choice the whole web now has to pay for by following the wrong route.
Dare Obasanjo had a great idea -- list the top 5 sites you read that still need syndicated feeds. Here are mine (heavily slanted towards entertainment, as most computer geeks have already gotten the clue):
Just logged into my GMail account (2061 MB and counting...) and noticed that the sign-on form is now branded as Google Accounts. Looks like the single sign-on thing for Google will be happening sooner rather than later.
Leave the blogosphere for a minute and it goes to hell in a handbasket, I tell ya... I missed a lot of teapot tempests, but I trust my past biases and tendencies would probably have clued you in already as to where I'd fall on so many of these... er, burning questions. I'll try to plow through all this inside-baseball bullshit in a single post so as not to befoul with it the genuinely interesting things I could be writing about.
Anyway, I'll just start tossing list bullets at Markdown and we'll see where it gets me.
...if we were older, then-- oops
Anyway, Google has led an initiative to take the steam out of various forms of online link pollution / abuse by proposing (and implementing, with the participation of a lot of vendors) an extension to the venerable anchor tag:
rel="nofollow". I hope it works. Fast uptake and wide implementation of this really promises to drain the monetary incentive from blogspam / refer(r)er spam / trackback spam / guestbook spam / wiki spam. If the people doing this realize that their links won't be spidered, hopefully they just won't bother any more.
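Mechanically there's not much to it: the weblog engine just stamps the attribute onto every link in user-submitted content before serving it. A toy Python sketch of the idea (a real plugin would want an actual HTML parser rather than a regex):

```python
import re

def nofollow_links(comment_html):
    """Add rel="nofollow" to every <a> tag in a chunk of comment
    HTML, leaving anchors that already carry a rel attribute alone.
    Regex-based, so strictly illustrative: self-closing or otherwise
    odd markup needs a proper parser."""
    def fix(match):
        tag = match.group(0)
        # Don't clobber an existing rel attribute.
        if re.search(r'\brel\s*=', tag, re.IGNORECASE):
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\b[^>]*>', fix, comment_html, flags=re.IGNORECASE)
```

Search engines that honor the attribute then skip those links when computing rank, which is the whole point.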
The problem, however, as Mark Pilgrim stated a while back, is that these jokers don't read weblogs, they just write to them. The time/financial economics of spamming are such that it's cheaper for a spammer to blindly spew on thousands of sites without error checking than it is for them to spend time individually checking their work to make sure that their scripts are working as intended.
My own weblog is a perfect example. Over the life of this site, I've basically had 3 spammers who have accounted for upwards of 99% of all the spam attempts. The earliest was a guy I refer to as "Unca Philtie", since his primary method was to produce hundreds of fake refer(r)als from a set of shell Blogspot blogs which linked to various Paris "Horsey" Hilton pr0nsites. For months after I removed the refer(r)er display from this site, he continued to bombard me with several hundred requests a day. It was cheaper for him to keep bombarding me with requests than it was for him to check his logs.
The second was the "Greets from me" guy, a pr0n comment spammer, so named because he actually signs his spams that way (still active). Of the three, he is certainly the cleverest, as he's occasionally made adjustments to his scripts to adapt to my countermeasures. He only has access to a limited number of zombie hosts, so I've largely been able to keep him neutralized at the firewall. I usually get a flood from him about once a week or so, when he picks up a few more proxies.
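The neutralizing itself is nothing fancy; whenever he surfaces with fresh proxies, their addresses go into the drop list. In spirit, it's something like this toy Python sketch (my actual setup differs in the details):

```python
def block_rules(addresses, port=80):
    """Turn a list of spam-zombie addresses into iptables DROP
    rules for the web port. Illustrative only: a production setup
    would use an ipset or similar rather than one rule per host."""
    rules = []
    for addr in sorted(set(addresses)):  # de-dupe, stable order
        rules.append(
            f"iptables -A INPUT -s {addr} -p tcp --dport {port} -j DROP"
        )
    return rules
```

Feed it the latest batch of proxy addresses from the logs and pipe the output at the firewall, and he goes quiet again for a week.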
The third is ol' Joe Incest, so named because all his incoming pr0n spams mention incest. They also mention a veritable cornucopia of other things, many of which I am certain are still impossible until we as a species evolve a few more protuberances and/or orifices, but I digress. I trust that the most straightforward way to defeat a spammer who always spams the same topics is self-evident. His script also has a pretty glaring bug (that I'm not going to help him fix by going into detail) that keeps any of his spams from ever even showing up here at all. Indeed, I expect that this bug would bite him on any Blosxom blog, so he must not be paying attention, which reinforces the point I made earlier on...
Anyway, it would be nice if this new initiative shows results, but we should certainly be prepared for the possibility that these assholes may just leave their buggy, braindead, and above all fast-cheap-nasty scripts running forever. After all, how often do you bother to clean up your crontabs?
‘Twas midnight, and the UNIX hacks
Did gyre and gimble in their cave
All mimsy was the CS-VAX
And Cory raths outgrabe.
“Beware the software rot, my son!
The faults that bite, the jobs that thrash!
Beware the broken pipe, and shun
The frumious system crash!”