Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.


Guns with Google Glass, city of driverless cars, Kurzweil on hybrid thinking

 
  • Tech companies and weapons manufacturers are exploring the crossover potential of firearms and wearable technology devices like Google Glass. Brian Anderson at Motherboard reported on Austin tech startup TrackingPoint's foray into this inevitable extension of augmented reality applications and posted the company's concept video:

"When paired with wearable technology, PGFs can provide unprecedented benefits to shooters, such as the ability to shoot around corners, from behind low walls, and from other positions that provide exceptional cover," according to a TrackingPoint press release. "Without PGF technology, such positions would be extremely difficult, if not impossible, to fire from."

The steady rise of wearable technology is unlocking a dizzying number of potential killer apps. Indeed, if there were any lingering doubt that wearable tech is coming to the battlefield, the Glassification of a high-profile smart weapon should put any uncertainties to rest.

If being able to track and drop a moving target with single-shot accuracy at 1,500 feet using a long-range robo rifle wasn't sobering enough already, the thought that basically anyone can now do so from over a hill, perhaps overlooking a so-called "networked battlefield" shot through with data-driven soldiers, is sure to be even more so.

The simulation is run on proprietary software, and programmers will code in dangerous situations, such as traffic jams and potential collisions, so engineers can anticipate problems and, ideally, solve them before the automated autos hit the streets. It's laying the groundwork for the real-world system planned for 2021 in Ann Arbor.

There will surely be some technical barriers to work out, but the biggest hurdles self-driving cars will have to clear are likely regulatory, legal, and political. Will driverless cars be subsidized like public transit? If autonomous cars eliminate crashes, will insurance companies start tanking? Will the data-driven technology be a privacy invasion?
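
To make the scenario-coding idea a little more concrete, here is a toy sketch of what scripting a hazard and checking the outcome might look like. The numbers and function names are invented for illustration; this is not the Ann Arbor project's actual software.

    # Toy illustration only: a scripted "traffic jam" hazard and a naive automated car
    # that brakes when the gap to the stopped queue closes below a trigger distance.
    def run_scenario(approach_speed, gap=40.0, brake_trigger=15.0, decel=8.0, dt=0.1):
        """Return whether the car stops safely or collides in this scripted situation."""
        speed = approach_speed
        while speed > 0:
            if gap < brake_trigger:
                speed = max(0.0, speed - decel * dt)   # brake hard
            gap -= speed * dt                          # close on the stationary queue
            if gap <= 0:
                return "collision"                     # a case for engineers to fix
        return "stopped safely"

    for v in (10.0, 20.0, 30.0):                       # scripted approach speeds, m/s
        print(f"approach at {v:4.1f} m/s -> {run_scenario(v)}")

With these invented numbers, the slowest approach stops safely while the faster ones register collisions, which is exactly the kind of failure a simulated city is meant to surface before real cars hit real streets.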

Today you can buy a top-of-the-line S-Class car from Mercedes-Benz that figuratively says “ahem” when you begin to stray out of your lane or tailgate. If you do nothing, it’ll turn the wheel slightly or lightly apply the brakes. And if you’re still intent on crashing, it will take command. In 5 years, cars will be quicker to intervene; in 20, they won’t need your advice; and in 30, they won’t take it.

Accident rates will plummet, parking problems will vanish, streets will narrow, cities will bulk up, and commuting by automobile will become a mere extension of sleep, work, and recreation. With no steering column and no need for a crush zone in front of the passenger compartment (after all, there aren’t going to be any crashes), car design will run wild: Collapsibility! Stackability! Even interchangeability, because when a car can come when called, pick up a second or third passenger for a fee, and park itself, even the need to own the thing will dwindle.
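
The escalation described two paragraphs up, from a polite warning to a gentle nudge to taking command, amounts to a small decision ladder. The sketch below is purely illustrative, with invented thresholds; it is not Mercedes-Benz's actual control logic.

    from enum import Enum

    class Action(Enum):
        WARN = "chime and flash a warning"             # the "ahem" stage
        NUDGE = "turn the wheel slightly or lightly apply the brakes"
        TAKE_OVER = "take command to avoid the crash"

    def assist(lane_offset_m, time_to_collision_s, driver_responded):
        """Escalate assistance as risk grows and the driver fails to react."""
        if time_to_collision_s < 1.0:
            return Action.TAKE_OVER                    # crash is imminent
        if lane_offset_m > 0.5 and not driver_responded:
            return Action.NUDGE                        # the warning was ignored
        if lane_offset_m > 0.3 or time_to_collision_s < 3.0:
            return Action.WARN
        return None                                    # nothing to do

    print(assist(lane_offset_m=0.6, time_to_collision_s=5.0, driver_responded=False))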

Two hundred million years ago, our mammal ancestors developed a new brain feature: the neocortex. This stamp-sized piece of tissue (wrapped around a brain the size of a walnut) is the key to what humanity has become. Now, futurist Ray Kurzweil suggests, we should get ready for the next big leap in brain power, as we tap into the computing power in the cloud.

The headband picks up four channels from seven EEG sensors: five across the forehead and two conductive rubber ear sensors. Together, the sensors detect the five basic types of brain waves, and, unlike conventional sensors, they don’t need to be surrounded by gel to work. Software helps filter out the noise and syncs the signal, via Bluetooth, to a companion app. The app shows the user the brainwave information and offers stress-reduction exercises.
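
For a sense of what filtering out the noise and sorting the signal into the five basic brain-wave types involves, here is a rough sketch using NumPy and SciPy. The sampling rate and the exact band boundaries are assumptions for illustration, not the headband's published specifications.

    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed sampling rate in Hz
    BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 44)}

    def band_powers(channel):
        """Estimate power in each classic frequency band for one EEG channel."""
        freqs, psd = welch(channel, fs=FS, nperseg=FS * 2)   # averaging smooths out noise
        return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
                for name, (lo, hi) in BANDS.items()}

    raw = np.random.randn(4 * FS)   # stand-in for four seconds of one sensor's raw signal
    print(band_powers(raw))

A companion app would presumably do something like this continuously over short windows, then map the band powers onto the feedback and stress-reduction exercises mentioned above.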

A bit further down the road of possibilities is brain-to-brain networking. Last year, researchers at the University of Washington used EEG sensors to detect one person's intention to move his arm, used that signal to stimulate a second person's brain with an external coil, and watched as the second person moved his hand without planning to.
