As Robert Fortner explained in Rest in Peas: The Unrecognized Death of Speech Recognition, after all these years, we’re desperately far away from any sort of universal speech recognition that’s useful or practical.
Now, we do have to clarify that we’re talking about universal recognition: saying anything to a computer, and having it reliably convert that into a valid, accurate text representation. When you constrain the voice input to a more limited vocabulary — say, just numbers, or only the names that happen to be in your telephone’s address book — it’s not unreasonable to expect a high level of accuracy. I tend to think of this as "voice control" rather than "voice recognition".
Still, I think we’re avoiding the real question: is voice control, even hypothetically perfect voice control, more effective than the lower-tech alternatives? In my experience, speech is one of the least effective and most inefficient ways of communicating with other human beings. By that, I mean …
- typical spoken communication tends to be off-the-cuff and ad hoc. Unless you’re extremely disciplined, on average you will be unclear, rambling, and excessively verbose.
- people tend to hear about half of what you say at any given time. If you’re lucky.
- spoken communication puts a highly disproportionate burden on the listener. Compare the time it takes to process a voicemail versus the time it takes to read an email.
I am by no means against talking with my fellow human beings. I have a very deep respect for those rare few who are great communicators in the challenging medium of conversational speech. Though we’ve all been trained literally from birth how to use our voices to communicate, voice communication remains filled with pitfalls and misunderstandings. Even in the best of conditions.
Click Here To Read Full Article [Written by Jeff Atwood]
Today, alongside Community Pages and a revamped Interests section, Facebook is also making some changes to its privacy controls. Facebook and confusing privacy settings have long gone hand in hand, which really isn’t a surprise given the huge amount of connections and data sharing that the site keeps track of. To Facebook’s credit, the site continuously iterates on its Privacy control panel to try to make it easier to use. Today though, Facebook has launched a new privacy section that may leave users scratching their heads.
It’s called “Friends, Tags, and Connections.” In short, this section is about the data on Facebook that you can’t actually control. You can make it harder to find, and even hide it from your profile, but you can’t remove it entirely.
Here’s how Facebook explains this:
Friends, Tags and Connections covers information and content that’s shared between you and others on Facebook. This includes relationships (shared between you and the person you’re in the relationship with), interests, and photos you’re tagged in. These settings let you control who sees this information on your actual profile. However, it may still be visible in other places unless you remove it from your profile itself.
via Facebook Launches New Privacy Section That May Make Your Head Hurt [TechCrunch.com]
In the current technology landscape, where people access the Internet through devices such as picture frames, netbooks, cell phones and televisions, the benefits of Web standards outweigh those of Flash, especially when delivering content to a broad audience across various devices.
Flash is a proprietary product that sits on top of the browser to extend functionality. While Flash may have provided missing functionality for some time, it brings little value to modern browsers. As more and more designers and developers realize the benefits of Web standards and start using some of the features of HTML5 and CSS3, we’ll see fewer Flash-driven websites.
Click Here To Read Full Article [via SmashingMagazine.com]
Written by Greg Jarboe
The Official Google Blog has made it official: "We’re including a new signal in our search ranking algorithms: site speed."
"While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page," said Singhal and Cutts. "Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven’t seen much change to your site rankings, then this site speed change possibly did not impact your site," they added.
Click Here To Read Full Article [via SearchEngineWatch.com]
Hundreds of WordPress blogs, particularly those hosted by Network Solutions, have been hit with an attack that cripples the blogs and redirects visitors to a URL that loads malware. The attack has been reported by both Sucuri Security Labs and Trend Micro. It works by replacing the contents of a WordPress blog’s "siteurl" field (under wp_options) with some HTML code. That field isn’t supposed to contain HTML, so it effectively breaks the blog.
Security companies haven’t figured out how the blogs were exploited, although Sucuri says it was probably SQL injection or a database problem at Network Solutions. Network Solutions is investigating, and looking to blame a WordPress theme or plugin for the security hole, Trend Micro says. Trend Micro also has some info on the malware that the blogs are now redirecting to: it’s a known malware family called BUZUS, and antivirus software should be able to identify it.
If your blog was affected, change your siteurl back to its old value. You can find it under Manage Database, in the wp_options table.
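As a rough illustration of what a tampered value looks like, here is a minimal sketch of a check for the symptom described above (a hypothetical helper, not part of WordPress or the vendors’ tools): a healthy siteurl is a bare URL, while the attack replaces it with HTML such as a script tag.

```python
import re

def siteurl_is_tampered(value: str) -> bool:
    """Heuristic check: a legitimate 'siteurl' option is a plain
    http(s) URL with no markup. Flag anything containing HTML
    angle brackets, or anything that isn't a bare URL at all."""
    if "<" in value or ">" in value:
        return True
    return not re.match(r"^https?://[^\s<>\"']+$", value)

print(siteurl_is_tampered("http://myblog.example.com"))   # False
print(siteurl_is_tampered(
    '<script src="http://evil.example/x.js"></script>'))  # True
```

The example domains are placeholders; a real cleanup would also restore the original URL value, as described above.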
Written by Tom Foremski
Ever since I first heard about the Internet and then saw its incredible development and application across industries, I’ve been on the lookout for the economic effects of this powerful platform technology. The specific economic influence I’ve been looking for is a strong deflationary trend. That’s when we’ll know the Internet has truly begun to reach its potential.
Let me explain why.
The Internet is a communications technology that also carries its own computer processing technology. We can deliver computing power to any connected computing platform, pocket- or desktop-based or otherwise, wired or wireless. We can see this quite clearly today in Internet-based technologies such as AJAX and beyond, which offer browser-based and non-browser applications delivered over the Internet.
This is distributed computing power communicated over a two-way medium: the Internet. Take a look at cloud computing technologies, where applications can be dynamically provisioned across scalable information architectures. While many of these technologies still have to go through an adoption process, which is partly cultural, we aren’t too far off from a world where powerful applications can be delivered anywhere, anytime, for little more than the cost of electricity.
Click Here To Read Full Article [via SiliconValleyWatcher.com]
“When I hear about a great idea that a friend has, I get excited. I can’t wait to see that idea become reality.
Then I ask about the idea a few months later, and it often is not one bit closer to completion.
Ideas stop short of becoming reality, and projects seem to drag on endlessly, because of one thing: complexity.” -Leo Babauta
1. Keep the scope as simple as possible. You don’t need to do everything with this project. In fact, if you can just do one thing, that’s perfect. As small a thing as possible. Don’t redesign an entire city — just work on one building. If the project starts to get complex or seem overwhelming, narrow the scope. Do less. It’ll help you get things done.
2. Practice ‘Good Enough’. Perfectionism is the enemy of completion. Nitpick and worry about getting it “just right”, and you’ll never get it done. Done is better than right. So if you start to nitpick and worry about perfect, say “screw it” and then just try for “good enough”. You can always make it better in the next version.
3. Kill extra features. Similar to simplifying the scope, you’ll want to try to make your creation do as little as possible. Want it to talk and walk and cook breakfast? Just try for talking. Want your website to publish great content and have social networking and podcasts and news and a newsletter and a membership area? Just shoot for great content. Whenever you find yourself adding new features, see if they can’t be killed.
4. Make it public, quick. Your goal should be to get your project in some working form out to your customers/readers/public as soon as possible. In as few steps, as quickly, as easily, as simply as possible. Remember: don’t worry about perfect, and don’t let this first public release be wide in scope or full of features. Release it with as few features as possible. Releasing it publicly will 1) get you to done faster and 2) put some pressure on you to make it better, quickly.
Click Here to Read Full Article [via ZenHabits.net]
By Arthur Max, Associated Press Writer
AMSTERDAM – Dirk Hannema was known as a brilliant art curator but a bit of a fool. He claimed he had seven Vermeers in his collection, several Van Goghs and a few Rembrandts, but no one believed him.
Now, 25 years after his death, it turns out he was right — about one work by Vincent van Gogh.
The painting, "Le Blute-Fin Mill," goes on public display Wednesday in the small Museum de Fundatie in the central Dutch town of Zwolle.
Louis van Tilborgh, curator of research at the Van Gogh Museum in Amsterdam, said the painting was unusual for the 19th century impressionist, depicting large human figures in a landscape. The painting shows Parisians climbing wooden stairs to a windmill in the Montmartre district.
But the work was typical of Van Gogh’s style at that time in other ways, with its bright colors lathered roughly on the canvas. Van Tilborgh said it was painted in 1886, when the artist was living in Paris. The canvas bore the stamp of an art store he was known to frequent, and it used pigments common in his other works, van Tilborgh said.
Click Here to Read Full Article [via Yahoo News via AP.org]
Streaming video websites like YouTube face growing pressure from consumers to provide support for native standards-based Web video playback. The HTML5 video element provides the necessary functionality to build robust Web media players without having to depend on proprietary plugins, but the browser vendors have not been able to build a consensus around a video codec.
Although the H.264 codec has gained dominance due to its excellent compression and broad support in the consumer electronics ecosystem, it is covered by patents that preclude broad royalty-free usage. Several browser vendors, including Opera and Mozilla, favor the Ogg Theora media codec, which is believed to be unencumbered by patents. Ogg may offer advantages from a licensing standpoint, but there are still many unanswered questions about its quality and suitability for Internet video streaming services.
Streaming media consultant Jan Ozer conducted a hands-on comparison of Ogg Theora and H.264 in order to shed some light on the relative difference in encoding quality and performance. He has published the results of his comparison, including screen captures and sample clips, in a report at the Streaming Learning Center.
In the videos and still images that he provides for comparison purposes, the H.264 content has better color quality and higher detail than the Ogg Theora content. Comparing 468 kbps clips, one can detect a very noticeable difference in quality between the two codecs. Even the 1 Mbps Ogg Theora clips are not on par with the 468 kbps H.264 clips. Based on the results, Ozer concludes that H.264 will have the upper hand in many Internet streaming scenarios.
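To put those data rates in context, bitrate translates directly into file size. A quick sketch of the arithmetic (the function name and the 1 kb = 1000 bits streaming convention are assumptions for illustration, not figures from Ozer’s report):

```python
def size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """File size in megabytes for a stream at a given data rate.
    kbps here means kilobits per second (1 kb = 1000 bits), the
    usual convention for streaming bitrates."""
    bits = bitrate_kbps * 1000 * duration_s
    return bits / 8 / 1_000_000  # bits -> bytes -> megabytes

# One minute of video at the rates discussed above:
print(round(size_mb(468, 60), 2))   # 468 kbps clip: 3.51 MB
print(round(size_mb(1000, 60), 2))  # 1 Mbps clip: 7.5 MB
```

In other words, the 1 Mbps Ogg Theora clips cost more than twice the bandwidth of the 468 kbps H.264 clips while still delivering lower quality, which is the trade-off behind Ozer’s conclusion.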
Image credit: Streaming Learning Center
"These tests are very aggressive, but purposefully so—at very high data rates, all codecs look good. In particular, YouTube encodes their H.264 video at 2mbps, about 2.5X higher than my tests. So my conclusion isn’t that Ogg is a bad codec; it’s that producers seeking the optimal balance between data rate and quality will find H.264 superior," he wrote.
Click Here to Read the Full Article [via ArsTechnica.com]