Choosing an effective password that’s easy to remember and type, yet hard for would-be fraudsters to guess, is a perennial problem. But it’s one that the folks at Microsoft Research are trying to tackle with an experimental tool called Telepathwords.
Armed with an arsenal of data on common passwords and password-setting habits, the team built a tool that detects how vulnerable your password is by trying to guess the next letter as you type it.
You can visit the project site for yourself and see how predictable your own passwords are. For example, if you think a clever password would be p@$$w0rd, think again – the tool guesses it right instantly. If your password is zxserisljeerouiaer2345, on the other hand, its telepathic propensity flounders.
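Telepathwords’ actual prediction model isn’t public, but the core idea, guessing your next character from patterns in known passwords, can be sketched with a simple character n-gram model. Everything below (the tiny training list, the function names) is illustrative, not Microsoft’s implementation:

```python
from collections import defaultdict

def build_model(passwords, order=2):
    """Count which character follows each `order`-length context
    across a corpus of known passwords."""
    counts = defaultdict(lambda: defaultdict(int))
    for pw in passwords:
        padded = "^" * order + pw  # "^" marks the start of a password
        for i in range(order, len(padded)):
            context = padded[i - order:i]
            counts[context][padded[i]] += 1
    return counts

def predict_next(counts, typed, order=2):
    """Return the most likely next character given what was typed so far,
    or None if the context was never seen in training."""
    context = ("^" * order + typed)[-order:]
    options = counts.get(context)
    if not options:
        return None
    return max(options, key=options.get)

# Toy corpus standing in for Telepathwords' real password data.
common = ["password", "p@ssword", "password1", "letmein", "passw0rd"]
model = build_model(common)
print(predict_next(model, "pass"))  # → "w"
```

A predictable password keeps yielding correct guesses; a string of random characters quickly produces contexts the model has never seen, which is exactly the behavior the article describes.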
via The Next Web.
IBM didn’t need to flaunt its debatable cloud dominance over Amazon Web Services on the sides of public buses to upstage the cloud kingpin at its user conference this week. Big Blue could simply have led with the news that its famous, Jeopardy!-champ-destroying Watson system is now available as a cloud service.
That’s right: Developers who want to incorporate Watson’s ability to understand natural language and provide answers need only have their applications make a REST API call to IBM’s new Watson Developers Cloud. “It doesn’t require that you understand anything about machine learning other than the need to provide training data,” Rob High, IBM’s CTO for Watson, said in a recent interview about the new platform.
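A REST call like the one High describes might look something like the sketch below. The endpoint URL, payload shape, and auth header here are all assumptions for illustration; IBM’s actual Watson Developers Cloud API may differ:

```python
import json
import urllib.request

def ask_watson(question, api_key):
    """Build a POST request asking Watson a natural-language question.
    The URL and JSON schema are hypothetical placeholders."""
    payload = json.dumps({"question": {"questionText": question}}).encode()
    req = urllib.request.Request(
        "https://watson.example.com/v1/question",  # placeholder URL
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
    )
    # Caller would send it with urllib.request.urlopen(req)
    # and parse the JSON answer from the response body.
    return req
```

The point High is making is that this is the whole integration surface: the developer supplies training data and issues HTTP requests, with no machine-learning code on their side.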
More on the details later, but first the big picture. If IBM actually delivers a workable cloud platform around Watson and developers actually take advantage of it to build new, smart applications, it will be a big fricking deal.
Google no longer understands how its “deep learning” decision-making computer systems have made themselves so good at recognizing things in photos.
This means the internet giant may need fewer experts in future as it can instead rely on its semi-autonomous, semi-smart machines to solve problems all on their own.
The claims were made at the Machine Learning Conference in San Francisco on Friday by Google software engineer Quoc V. Le in a talk in which he outlined some of the ways the content-slurper is putting “deep learning” systems to work.
“Deep learning” involves large clusters of computers ingesting and automatically classifying data, such as things in pictures. Google uses the technology for services such as Android’s voice-controlled search, image recognition, and Google Translate.
via The Register.