Note (12/2015): Hi there! I'm taking some time off here to focus on other projects for a bit. As of October 2016, those other projects include a science book series for kids titled Things That Make You Go Yuck! -- available at Barnes and Noble, Amazon and (hopefully) a bookstore near you!

Co-author Jenn Dlugos and I are also doing some extremely ridiculous things over at Drinkstorm Studios, including our award-winning webseries, Magicland.

There are also a full 100 posts right here in the archives, and feel free to drop me a line at secondhandscience@gmail.com with comments, suggestions or wacky cold fusion ideas. Cheers!

· Categories: Computers
What I’ve Learned:

“Turing test: where men are men, except sometimes they’re not.”

Like most computery sorts of things, Turing tests are only properly understood by a few pale geniuses who know how slide rules work and never have anything to do on Friday nights.

(Except to argue about the “proper” understanding of Turing tests. And catch up on Eureka reruns.)

But a Turing test basically boils down to one simple question:

Can a computer convince an “average interrogator” that it is human — and not, in fact, a computer?

The Turing test was first proposed by British mathematician and computer scientist Alan Turing in the 1950s. This was around the time when people first began to wonder whether machines could someday think on their own. Only nobody could define precisely what constituted “thinking”, and computers the size of post offices could scarcely rub two digits together, so the question went mostly nowhere.

That’s when Turing — now considered the father of theoretical computer science — posed his question, which was much easier to test. Whether the computer can “think” or not, can it fool people into believing it’s a live, thinking person? Turing tests come in a few flavors, but they mostly work like this:

An interrogator types questions to two test subjects — one of whom is flesh-and-blood human, while the other is motherboard-and-capacitor machine. Each answers via text — no cheating where one of the voices sounds like C-3PO or Bender — and after a few rounds, the interrogator decides which is the human. If he or she chooses the computer more than thirty percent of the time, the machine passes the Turing test.
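
If you prefer your protocols in code, here's a minimal sketch of that setup. To be clear: this is a toy harness, not anybody's official benchmark. The subjects are canned stubs and the interrogator just flips a coin, so every name and question in it is made up for illustration:

    import random

    # Two stub "subjects": one canned human, one canned chatbot.
    # A real test would wire these up to a live typist and a live program.
    def human_subject(question):
        return "I grew up near Leeds; mostly I remember the rain."

    def machine_subject(question):
        return "That is an interesting question. Tell me more about that."

    def run_session(interrogator, rounds=5):
        """One session: the interrogator quizzes subjects A and B (randomly
        assigned), then guesses which label belongs to the human."""
        subjects = {"A": human_subject, "B": machine_subject}
        if random.random() < 0.5:           # shuffle so labels carry no clues
            subjects = {"A": machine_subject, "B": human_subject}
        transcript = []
        for i in range(rounds):
            question = f"Round {i + 1}: what's on your mind?"
            answers = {label: ask(question) for label, ask in subjects.items()}
            transcript.append((question, answers))
        guess = interrogator(transcript)            # "A" or "B"
        return subjects[guess] is machine_subject   # True = machine fooled them

    def average_interrogator(transcript):
        return random.choice(["A", "B"])    # a very "average" interrogator indeed

    # The machine "passes" if it's mistaken for the human more than 30% of the time.
    sessions = 1000
    fooled = sum(run_session(average_interrogator) for _ in range(sessions))
    print(f"Mistaken for human in {100 * fooled / sessions:.1f}% of sessions")

A coin-flipping judge gets fooled about half the time, which says less about the machine and more about why that "average interrogator" clause does so much heavy lifting.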

And then Skynet probably gets built and melty-face Robert Patrick comes back in time to kill John Connor and all of human civilization will depend on Arnold Schwarzenegger saving our asses again, which seems way less likely now that he’s a member of the AARP.

Still, people perform Turing tests. Probably because they want to know when to start hoarding canned food and electromagnetic pulse bombs.

For decades, Turing tests were pretty non-apocalypse-portending. A gadget might do okay with questions limited to one subject, or when trying to pass itself off as a paranoid schizophrenic. But until recently, no computers had gotten especially close to passing a Turing test using what science would consider an average interrogator.

(In contrast to life outside science, where Siri and chatbots and Japanese girlfriend simulators have been talking people into giving away money and marriage proposals for years.

Clearly, the bar for “average interrogator” is just a leeeetle bit lower in some parts of the digital world.)

But just this week, a computer managed to pass a Turing test administered in London, fooling thirty-three percent of interrogators into believing it was a thirteen-year-old boy.

Which some might argue is only a small step up from a paranoid schizophrenic. Still — progress.

Of course, many in the artificial intelligence community don’t pay much attention to Turing tests. There are many ways to run one, and (see above) many ways in which people allow themselves to be fooled by relatively unsophisticated programs. Besides these difficulties, some computer scientists question the very relevance of Turing tests in the modern age. The goal of AI, after all, is to make machines more intelligent — not to make them more like humans.

I’ll leave it to the reader to decide how wide the chasm is between those two goals. Just remember — we created Two and a Half Men. So there is a chasm. Clearly.

Image sources: Computer Science Unplugged (Turing test cartoon), Futurama Point (angry Bender), What Culture (Robert Patrick/T-1000), OffTopics (The Termin-older)



· Categories: Biology
What I’ve Learned:

“Mycoplasma: Small things, which probably also come with small packages.”

In every group, there’s one little guy trying to compensate for his small stature by acting tough and being a royal pain in the ass. In the bacterial world, that’s Mycoplasma.

Mycoplasma is a genus — that is, a group of species — of one-celled microscopic critters, just like all other bacteria. But there are a few important differences. For one thing: size. Even by bacterial standards, mycoplasmas are runty; in fact, they hold the record as the smallest living cells known to science.

It’s true. Most people think it’s Verne Troyer. But no. Mycoplasma.

Besides making them cocky and insufferable, mycoplasmas’ small size creates problems for researchers who study them. These bacteria are too small to spot under most light microscopes, and they wriggle right through filters that trap their beefier bacterial cousins.

Another difference is the cell wall — a tough layer outside the cell that provides structure and support. Most bacteria (and fungi and algae and plants) have them; mycoplasma species don’t. That’s highly unusual for a single-celled creature. They’re oozing around barely decent, in nothing but their skimpy cell membranes. Mycoplasmas are the creepy exhibitionist neighbors of the bacterial neighborhood, who never put on pants and refuse to close the blinds.

Sadly, the mycoplasmas’ lack of shame causes more problems than the occasional unwanted peepshow. Many antibiotics — including penicillin — attack bacterial cell walls. Mycoplasmas laugh and shake their scantily-clad bacterial butts at these drugs. So if you catch a mycoplasma infection — they cause atypical pneumonia and pelvic inflammatory disease, among others — you won’t get rid of them with the usual antibiotic suspects. The little bastards actually are tougher than they look.

But mycoplasma species aren’t just compensating for puny size. Besides the cell wall, they have genomes smaller than almost any other organism’s — so small that scientists were able to synthesize an entire Mycoplasma genome from scratch and use it to create an effectively “synthetic” bacterial cell.

Also, they can’t make any of their own amino acids, and leech cholesterol from whatever they’ve infected to shore up their cell membranes. So in a sense, mycoplasmas are incomplete — dependent on hosts to survive, feeble and literally devolving. Scientists think the cell wall and genetic material were lost from Mycoplasma species after they split off from other bacteria.

Seriously, no wonder they’re a pain in the ass. I wouldn’t be surprised if mycoplasmas all had hair plugs and gold chains and drove teeny little BMWs.

Mycoplasmas are so screwy, they’re a big problem for biologists — even the biologists not trying to study them. Maybe especially the ones not trying to study them.

The issue is that mycoplasma contamination is very hard to detect, requiring special tests. Much biological research depends on cell lines and tissue cultures — living cells extracted from humans or animals that can be grown in the lab and tested in various ways. But if mycoplasma is lurking unseen inside those cells, then any test results could be compromised. And it’s estimated that a third or more of the world’s cultures may be contaminated.

All that runty bacterial goop screws up the real science, and no one knows exactly what the results are until the cells are cleaned up and experiments repeated. This costs time, money and — worst of all — graduate student sanity. Those poor lackeys don’t have much to begin with, and mycoplasma infections can shred their last remaining wits. I’ve seen it. It’s tragic.

So beware of the Mycoplasma species. They’re tiny, overcompensating, and seemingly everywhere. They’ll make you sick, drive you crazy and get less evolved every day. In short, mycoplasmas are like an unseen horde of Rob Schneiders, storming into your favorite cells. The horror. Oh, the HORROR.

Actual Science:
MicrobeWiki / Kenyon College: Mycoplasma
The Scientist: Out, damned mycoplasma!
Nature: Researchers start up cell with synthetic genome
Journal of the American Medical Association (JAMA): Newly discovered protein helps mycoplasma evade immune response
MIT Technology Review: Human genome contaminated with mycoplasma DNA

Image sources: Animal Architecture (synthetic mycoplasma), US Magazine (Verne Troyer), AutoWeb (small penis), Suitably Bored and SodaHead (Schneider horde)



· Categories: Biology
What I’ve Learned:

“Australopithecus: In hominid evolution, you win some, you Lucy some.”

Australopithecus sounds like an oddball instrument some grimy dreadlocked dude from New Zealand would play — or possibly a disease you might catch from said grimy Kiwi dude, if you stood too close to him during the concert.

Happily, Australopithecus is actually a genus — that is, a group of species — that lived in southern and eastern Africa between about two and four million years ago. The term “Australopithecus” means “southern ape”, which is a reasonable description of these hairy prehistoric hominid furballs.

(Also, it’s probably not a bad description of that nappy New Zealander. I’m just saying.)

The Australopithecus species — including the famous “Lucy” specimen — are particularly interesting, because they’re the earliest fossil remains of ape-like creatures that weren’t particularly ape-like. Though they had relatively small brains, analyses of Australopithecus toes, feet and hips reveal that they could probably walk upright. They may have also played hopscotch and danced the lambada on steamy Saturday nights.

(Those last bits aren’t really supported by the science. I just like to picture them.)

Also, most Australopithecus species had powerful teeth and jaws not seen in apes, and some may even have used tools. So they were no longer true apes; they were more like apes crossed with crocodiles who could walk on two legs and work a can opener, but probably played a lousy game of chess because their brains were still the size of golf balls.

Or, basically a pack of slightly-hairier Rachael Rays. I apologize in advance for the nightmares that phrase will surely cause.

Thus, Australopithecus has answered one important question about human evolution: the adaptation to stand upright was not driven by the development of a big, human-like brain. Early hominids were walking (and cooking 30-minute meals, apparently) before they had the ginormous craniums we’re all so proud of.

But another question remains: were the Australopithecus species, the original bipedal hairballs, direct ancestors of modern-day humans? It depends on who you ask.

(Just don’t ask Rachael Ray. She’s kind of sensitive about it.)

On one hand, there’s good evidence that Australopithecus species walked upright, and they have several features — like particular bone shapes and relative sizes — intermediate between earlier species’ skeletons and those found in doctors’ offices and Halloween costume shops today. Some scientists say: if it mostly waddles like a prehistoric duck, and it vaguely quacks like a prehistoric duck, then slap a mammoth skin on it and call it Daffy. For them, chances are we have Australopithecuses in our family tree.

Other scientists aren’t so sure. “Upright” is one thing, but these pre-humans probably didn’t walk or run like we do, and likely still climbed trees and “knuckle-walked” on a regular basis. Also, there were still Australopithecus species hanging around after the Homo genus emerged about 2.4 million years ago. Did the Homo species, including Homo sapiens — i.e., every human on the planet, except possibly Carrot Top — evolve from one of the earlier Australopithecus groups? Or intermingle with a later one? Did we evolve from a common ancestor, then watch their branch die out over the millennia?

At this point, nobody knows for sure. More fossil specimens — like the new species Australopithecus sediba, described in 2010 — may someday provide conclusive evidence of a link between the genera. That would be exciting news, indeed.

Except then we’d have to invite Rachael Ray to the family reunions. And nobody wants to be at that potluck, Junior. Pass the prehistoric potato salad.

Image sources: Discovering Cultures (Australopithecus), Go, See, Write (dreadlocked dude), ResearchMatters/ASU and Armchair Cook (Rachael Ape), After and Before (Make-it-stop, Carrot Top)



· Categories: Physics
What I’ve Learned:

“Brownian motion: even stumbling home drunk means you’re doing science.”

Brownian motion is one of those things in science that are easy to observe, but take an awful lot of math and fancy calculators to explain under the hood. Much like gravity, or rainbows, or why anyone on the planet still listens to Coldplay.

In simple terms, Brownian motion describes the movement of particles floating in a liquid or gas. This motion is caused by the molecules of that liquid or gas randomly bumping into the particles, jostling them in unpredictable directions. You’re probably familiar with this process, if you’ve ever watched dust dancing on a pool of water, or tried to get out of a crowded subway car when the doors open on the opposite side.
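
You can fake the whole effect in a few lines of code. Here's a minimal sketch (mine, not anything official) that models each molecular kick as a small random nudge in x and y:

    import random

    def brownian_walk(steps, kick=1.0):
        """2-D random walk: every step, molecular collisions jostle the
        particle by a small random amount in each direction."""
        x = y = 0.0
        path = [(x, y)]
        for _ in range(steps):
            x += random.gauss(0, kick)
            y += random.gauss(0, kick)
            path.append((x, y))
        return path

    # Average over many particles: the mean squared distance from the start
    # grows linearly with the number of steps -- the signature of diffusion.
    walks = [brownian_walk(1000) for _ in range(500)]
    for t in (10, 100, 1000):
        msd = sum(w[t][0] ** 2 + w[t][1] ** 2 for w in walks) / len(walks)
        print(f"steps={t:5d}   mean squared displacement ~ {msd:7.1f}")

Each individual path looks like drunken staggering, but the averages are perfectly well behaved. That well-behaved average is exactly what the later math pins down.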

Scientists have observed this “random walk” movement of particles for centuries; there was even an ancient Roman philosopher, Lucretius, who described it back in 60 BC. But the rest of the Romans were apparently busy inventing candles or shades or wrestling Grecos or something, and nobody thought much about wiggling little particles for another eighteen hundred years.

The first person who got back to it was a Dutch guy named Jan Ingenhousz, who in 1785 described the movement of coal dust particles in alcohol. Because that was apparently the most interesting thing he could think of doing with alcohol in 18th century Europe. I’m sure he was an absolute riot at fancy dress balls.

Scientists agreed that Ingenhousz was onto something, but nobody wanted to put a tongue-twister like “Ingenhouszian motion” into the textbooks, probably, so his contribution was mostly swept under the rug, along with his coal dust. And his party invitations.

It wasn’t until 1827 that a more reasonably-named Scottish botanist, Robert Brown, came along and stared at tiny grains of pollen skittering in water — because evidently he couldn’t get a date on Friday night, either. But at least his name was easier to spell, and scientists have called it “Brownian motion” ever since.

To be fair, Brown didn’t actually explain what was happening. He just noticed particles lurching around like those drunken bastards staggering out of ballrooms at all hours of the night, while he was stuck alone in a laboratory, squinting into microscopes and questioning his life choices.

It was another few decades before people — including Albert Einstein, naturally, because what didn’t he do? — sorted out the math behind Brownian motion, which involves a bunch of Greek letters and constants and other stuff my Fisher-Price calculator isn’t equipped to deal with. All I know is, the solutions also supported the idea of unseen tiny atoms and molecules, which wasn’t a done deal at the time. So that was progress.
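
For the morbidly curious, the headline result is mercifully short. In Einstein's 1905 treatment, the average squared distance a particle drifts grows linearly with time, and the diffusion coefficient D ties that drift to temperature, the fluid's gooeyness and the particle's size:

    \langle x^2 \rangle = 2Dt, \qquad D = \frac{k_B T}{6 \pi \eta r}

Here T is the temperature, eta is the fluid's viscosity, r is the particle's radius and k_B is Boltzmann's constant. The derivation, I promise, involves many more Greek letters than that.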

Besides the squiggly pollen grains and scary maths, what does understanding Brownian motion buy us? Actually, quite a lot. Those same models and equations have been applied to improving medical imaging, helping robots auto-navigate tricky terrain, optimizing schedules in manufacturing, explaining animal herding behavior, studying stock market fluctuations and developing solutions in nanotechnology.

In fact, about the only thing the study of Brownian motion hasn’t done is to get more scientists invited to fancy dress balls.

Random staggering or no, some things in science never change.

Image sources: ETSU / Bob Gardner (Brownian motion), JG Stevenson (crowded subway), This Old Toy (Cookielator), Hypable (physicist drinking alone, aka ‘sad Raj’)



· Categories: Biology, Genetics
What I’ve Learned:

“Synthetic genomics: when regular old DNA just won’t do.”

Genomics is the study of and the fiddling with (science term) an organism’s DNA. So naturally, you might think that “synthetic genomics” is studying and fiddling with DNA while wearing polyester.

It’s not. From what I’ve seen of most biologists’ wardrobes, the polyester thing is pretty much implied in all of genomics. And most weekend parties.

Instead, synthetic genomics is a particular style of fiddling with DNA that uses components and rules that nature hasn’t gotten around to trying yet.

(Because nature tends to be very busy doing other things. And in her spare time, distracted by all the polyester.)

There’s a big difference between sciences like “genetic modification” or “genetic engineering” and synthetic genomics. All sorts of organisms’ genomes have now been modified in the lab — corn, for instance, and glow-in-the-dark fish, and possibly Jocelyn Wildenstein. But in these cases, the genes engineered into the DNA came from other species in nature, and followed the usual rules for how DNA works.

(Not how faces work, necessarily. But at the DNA level, it’s all textbook. And usually a difference of just one or two genes.)

But synthetic genomics is different. Here, the usual rules go out the window. Recently, a team of scientists used computers to redesign a chromosome found in yeast, synthesized the new sequence and plugged it back into real yeast cells.

Why? I’m not entirely sure. Maybe they’re trying to create glow-in-the-dark beer, or sandwich bread that talks to you while you eat it. Both of which I’m in favor of — preferably in the same meal. But in the meantime, it’s a monumental bioengineering achievement, and could produce more efficient yeast.

No doubt the Pillsbury dough boy and Budweiser Clydesdales are salivating over that.

Another team is trying to modify pig DNA to look more similar to our own. Which makes sense, of course, because a real-life Porky Pig is an obvious choice to be the next judge on The Voice. But actually, it’s so pig lungs can be yanked out and transplanted to save humans with terminal lung disease, while eliminating the problem of foreign tissue rejection.

(Which just goes to show, everything is better with bacon. Including cross-species thoracic surgery. Mmmmmm, bacon.)

The biggest news to come out of synthetic genomics recently is that the DNA inside every living thing — from bacteria to badgers to Barbara Walters — can be expanded upon. Improved. Turned to eleven.

Announced just this week, scientists at the Scripps Research Institute whipped up a strain of E. coli bacteria that don’t just use the usual four basic building blocks of DNA, but instead use six.
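
To get a feel for what "six building blocks" means mechanically: natural DNA pairs A with T and G with C, and the new strain carries one extra, unnatural pair on top of those. Here's a toy sketch, with "X" and "Y" as stand-in nicknames for the synthetic bases rather than their real chemical names:

    # Watson-Crick pairing for the usual four bases, plus one unnatural pair.
    # "X" and "Y" are placeholder nicknames; the real molecules are synthetic
    # bases that look nothing like the natural four.
    PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G", "X": "Y", "Y": "X"}

    def complement(strand):
        """Return the complementary strand over the six-letter alphabet."""
        return "".join(PAIRS[base] for base in strand)

    print(complement("GATXACY"))   # -> CTAYTGX

Two extra letters may not sound like much, but the number of possible three-letter "words" jumps from 64 to 216 — which is where those never-before-seen proteins come in.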

This is big news. It’s like when Columbus turned Europeans on to a whole new continent, or Lewis and Clark followed Sacagawea through the Northwest, or the day you discovered you actually like dark beer. Whole new frontiers open up, full of possibility and hangovers and grizzly bear attacks.

And now, full of semi-synthetic genetically-fiddled-with E. coli, like little microscopic scientists wearing polyester pants. Sometimes the bacteria don’t fall far from the tree.

In practical terms, an expanded DNA alphabet could lead to revolutions in genetics, bioengineering and the ability to mass-produce useful proteins that have never before existed.

Whether that will finally improve biologists’ wardrobes is still up in the air. Science can only do so much.

Image sources: ChemistryWorld (6-base DNA), NBC New York (Jocelyn Wild-n-woolly-stein), Broadway World and Popscreen (Porky and the Voice), Unstable Molecules (unfashionable scientists [Phil and Lem, awww])
