2014 was a great year for crowdfunding. Kickstarter had 22,252 projects raising a total of $529 million, up from $480 million raised in 2013. Indiegogo had a 1,000% increase in funds raised over the past two years, and both Indiegogo and Kickstarter had their most-funded projects ever in 2014.
This year is already shaping up as the greatest ever for the young industry, with three 2015 projects already making it onto the list of the 15 most-funded crowdfunding projects on Kickstarter and Indiegogo:
1. $13,285,226 from 62,642 funders August 2014 (Kickstarter)
Coolest Cooler: 21st Century cooler that’s actually cooler, complete with built-in ice crushing blender, a waterproof bluetooth speaker and a USB charger.
2. $10,266,845 from 68,929 funders May 2012 (Kickstarter)
Pebble: E-paper watch for iPhone and Android, customizable watch with downloadable watchfaces, sports and fitness apps, notifications from mobile phone.
3. $8,782,571 from 219,382 funders February 2015 (Kickstarter)
Exploding Kittens: A card game for people who are into kittens and explosions and laser beams and sometimes goats.
4. $8,596,474 from 63,416 funders August 2012 (Kickstarter)
OUYA: A new kind of video game console, cracking open the last closed platform, the TV; a beautiful, affordable console, built on Android.
5. $6,225,354 from 18,220 funders April 2014 (Kickstarter)
Pono Music: Where your soul rediscovers music, providing the best possible listening experience of your favorite digital music. With the PonoPlayer, you can finally feel the master in all its glory, in its native resolution, CD quality or higher, the way the artist made it, exactly.
6. $5,702,153 from 91,585 funders April 2013 (Kickstarter)
The Veronica Mars Movie Project, a feature film version of the defunct television series.
7. $5,408,916 from 105,857 funders July 2014 (Kickstarter)
Bring Reading Rainbow Back for Every Child, Everywhere! Bring Reading Rainbow’s library of interactive books & video field trips to more platforms & provide free access to classrooms in need.
8. $5,022,041 (including $2.5 million matched contributions) from 2,801 funders December 2014 (Indiegogo)
Code.org, introducing coding to 100 million students.
9. $4,188,927 from 74,405 funders April 2013 (Kickstarter)
Torment: Tides of Numenera, a story-driven Computer Role-Playing Game (CRPG) set in the world of Monte Cook’s Numenera.
10. $3,986,929 from 73,986 funders October 2012 (Kickstarter)
Project Eternity, an isometric, party-based computer RPG set in a new fantasy world.
11. $3,845,170 from 67,226 funders October 2013 (Kickstarter)
Mighty No. 9, a video game
12. $3,602,037 from 12,075 funders January 2015 (Kickstarter)
ZANO, the world’s most sophisticated nano drone.
13. Sondors Electric Bike, the world’s most affordable, versatile electric bike.
14. $3,429,235 from 17,744 funders August 2012 (Kickstarter)
Reaper Miniatures Bone: Gaming miniatures.
15. $3,401,361 from 11,855 funders May 2014 (Kickstarter)
The Micro: The first truly consumer 3D printer.
[Originally published on Forbes.com]
These are the top ten most-funded startups in the big data space:
1. Cloudera ($1,040 million): Hadoop-based software, services and training
2. Palantir Technologies ($950 million): Analytics applications
3. MongoDB ($311 million): Document-oriented database
4. Domo ($250 million): Cloud-based business intelligence
5. Mu Sigma ($195 million): Data-Science-as-a-Service
6. DataStax ($190 million): Apache Cassandra-based platform
7. MapR ($174 million): Hadoop-based software, services and training
8. Opera Solutions ($122.2 million): Data-Science-as-a-Service
9. Guavus ($107 million): Operations intelligence platform
10. [Tie] Adaptive Insights ($101.3 million): Cloud-based business intelligence
10. [Tie] GoodData ($101.2 million): Cloud-based business intelligence
This list is based on my own research, which augments the very helpful but sometimes unreliable CrunchBase database. The funding totals include VC contributions as well as investments by established companies.
[Originally published on Forbes.com]
At a public event last month, Edward Snowden argued that the NSA has developed a “culture of impunity,” that its people “are not villains, but they think they can do anything because it is for a just cause.” John DeLong, an NSA Director, responded that “the idea that NSA activities were unauthorized is wrong, it’s wrong in a magnificent way.”
The two came as close as possible to a live debate at the “Privacy in a Networked World” symposium at Harvard University’s Institute for Applied Computational Science where Snowden had a wide-ranging discussion via a video link with security expert Bruce Schneier. John DeLong (Harvard Law, former Director of Compliance at the NSA, current Director of the NSA’s Commercial Solutions Center), immediately followed with his talk about privacy, insisting he did not want to turn it into a “point-by-point, Oxford-style debate.”
Indeed, the specific details about this surveillance program or that particular law are far less important than the answer to a single question: In defending your country, do you do the right thing or do you do things right? Do you do what’s morally right or do you just follow the rules (or make sure it looks like you follow the rules)?
I think Snowden won the “debate” hands-down because I much prefer his view of the good people of the NSA. Snowden sees them as thinking humans, aware of the values of the country they are trying to defend, and capable of making difficult decisions and weighing all the ramifications of their actions. They went wrong because they got carried away by their mission. DeLong, in contrast, views NSA employees (himself included) as no different from the machines they work with, capable only of unquestioningly following rules handed down to them by others, rules that can be (and should be) codified into machine language. They haven’t done anything wrong, because they have never broken the law.
As you can see in the video below, most of the Snowden-Schneier discussion revolved around the familiar themes we’ve seen since the 2013 revelations: The solution to governments intruding into our lives is technology, or the encryption of all communications; mass surveillance has exploded because it’s cheap and easy, but it has never stopped a terrorist attack; the technical advantage the NSA used to have over the bad guys has been all but eliminated; the NSA has shifted its focus, and a much larger proportion of its effort now goes to offense, not defense.
But about 35 minutes into the conversation, Schneier brought up what I think is the crux of the matter by pointing out the distinction between “Are we following the rules?” and “Are these the right rules?” He suggested that “the way you get this greater oversight is these discussions of what makes sense, what is moral in our society, what is proper.”
Unfortunately, Snowden—in this case—didn’t take the bait and responded by pointing out the financial cost of the NSA’s actions in terms of the damage to the business of American high-tech corporations. Showing the photo of the NSA tapping into a Cisco router, he said: “this has a real cost, not just legally, not just morally, not just ethically, but financially.”
In his Wired interview with James Bamford, however, Snowden was quite eloquent about the moral and ethical costs (referencing Arendt’s Eichmann in Jerusalem: A Report on the Banality of Evil) and how a “culture of impunity” has developed at the NSA and other parts of our government:
“It’s like the boiling frog… You get exposed to a little bit of evil, a little bit of rule-breaking, a little bit of dishonesty, a little bit of deceptiveness, a little bit of disservice to the public interest, and you can brush it off, you can come to justify it. But if you do that, it creates a slippery slope that just increases over time, and by the time you’ve been in 15 years, 20 years, 25 years, you’ve seen it all and it doesn’t shock you… [Clapper] saw deceiving the American people as what he does, as his job, as something completely ordinary. And he was right that he wouldn’t be punished for it, because he was revealed as having lied under oath and he didn’t even get a slap on the wrist for it. It says a lot about the system and a lot about our leaders.”
Eichmann said in his defense that he was just following orders. Rudolf Hoess, the commander of Auschwitz, where more than a million people were murdered, said in his autobiography: “The reasons behind the extermination program seemed to me right. I did not reflect on it at the time: I had been given an order, and I had to carry it out.”
In his passionate defense of the NSA, John DeLong brought up the testimony of Professor Geoffrey Stone who served on the President’s Review Group that assessed the actions of the NSA after Snowden’s revelations, quoting from Stone’s Huffington Post blog: “Not only did I find that the NSA had helped to thwart numerous terrorist plots against the United States and its allies in the years since 9/11, but I also found that it is an organization that operates with a high degree of integrity and a deep commitment to the rule of law. The Review Group found no evidence that the NSA had knowingly or intentionally engaged in unlawful or unauthorized activity.”
But there is more (and more to the point) in the same post:
This is not to say that the NSA should have had all of the authorities it was given. The Review Group found that many of the programs undertaken by the NSA were highly problematic and much in need of reform. But the responsibility for directing the NSA to carry out those programs rests not with the NSA, but with the Executive Branch, the Congress, and the Foreign Intelligence Surveillance Court, which authorized those programs — sometimes without sufficient attention to the dangers they posed to privacy and civil liberties. The NSA did its job — it implemented the authorities it was given.
Of course, “I was only following orders” is not always an excuse. But in no instance was the NSA implementing a program that was so clearly illegal or unconstitutional that it would have been justified in refusing to perform the functions assigned to it by Congress, the President, and the Judiciary. Although the Review Group found that many of those programs need serious re-examination and reform, none of them was so clearly unlawful that it would have been appropriate for the NSA to refuse to fulfill its responsibilities.
So following orders “is not an excuse” and the NSA followed orders that were “highly problematic” but the responsibility lies elsewhere because the NSA people are simple automatons? And if “many of those programs need serious re-examination and reform,” why does Professor Stone, instead of thanking Snowden for being the catalyst for this much needed re-examination, think that he “deserves punishment”?
Similarly, DeLong said that, with regard to Snowden, “we need to let the wheels of justice turn in his case.” He also talked about the President’s directive of January 2014 and about updates to this presidential policy due next month, all of course coming after Snowden’s revelations. “When the NSA says we are going to be more transparent in this place, you need to hold us more accountable,” DeLong exhorted the audience. Would he talk today about the need to move transparency “up a level” if not for Snowden?
DeLong, who served as Director of Compliance at the NSA from 2009 to late 2014, realized in that job that “to operate consistently under the rule of law, rules must be translated into technical requirements.” He went further to say that if you are the policy or legal guy writing these rules, “try to imagine that you are writing a recipe that will be cooked to scale, by many cooks, some cooks being humans, some cooks being machines.”
DeLong (who is also a Certified Compliance and Ethics Professional) is twice removed (two hops?) from the ethics of the decisions (e.g., hack into the Cisco router or not?). His first defensive wall is murky laws, no matter that they are subject to conflicting interpretations or were written for an analog world. His second moat is code. “If you think of the rules as essentially system and functional requirements,” he said, “what look like legalese, words on the page, might start to look like object-oriented code, complete with if-thens, go-tos, some spaghetti code now and then, some cut and paste, some patterns, some anti-patterns.”
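DeLong’s image of legalese as object-oriented code is easy to make concrete. Here is a minimal sketch, with an entirely invented rule and invented names (nothing here reflects any actual NSA policy or system), of what a law looks like once it has been “cooked to scale” by machine cooks:

```python
from dataclasses import dataclass


@dataclass
class CollectionRequest:
    """A hypothetical request to collect data on a target."""
    target_is_foreign: bool
    court_order: bool
    purpose: str


def is_authorized(req: CollectionRequest) -> bool:
    # The "recipe" DeLong describes: legalese reduced to if-thens
    # that any cook, human or machine, can follow at scale.
    if not req.court_order:
        return False
    if not req.target_is_foreign:
        return False
    return req.purpose in {"foreign_intelligence", "counterterrorism"}


print(is_authorized(CollectionRequest(True, True, "counterterrorism")))  # True
```

Notice what the translation leaves out: code like this can answer “are we following the rules?” but has nowhere to even pose “are these the right rules?”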
I’m not making this up. DeLong also asserted that “the law is really the math and science of human interaction” and that “perhaps quantum entanglement is the physics of sharing human ideas.” He is keen on “discussions,” on “lawyers talking to engineers,” on pulling “together people from different parts of society on equal grounds.” He thinks that protecting privacy and civil liberties today is “more art than science,” but that “the science of privacy,” allowing “society to work with data,” is going to be seen as “one of the great engineering, math, science, legal, policy feats of our time.” And he knows how to achieve it: “To understand each other’s language and to translate difficult law and policy decisions into repeatable and understandable computations, takes some time, patience, and lots of conversations. So I would sternly urge everyone to join the effort. The self-reinforcing circles that might in the short-term make us more comfortable, in the long term don’t really advance–moving us forward–in the art and science of privacy.”
Did DeLong refer to the membership of the presidential Review Group mentioned above when he talked about “self-reinforcing circles”? Are “computations” really the solution to what are essentially moral dilemmas?
We have here a serious and probably sincere person defending the actions of the NSA in this way because Big Talk and Big Data have taken over. Big Talk is what Orwell called “Newspeak,” the language invented by the totalitarian state in 1984. Have you noticed that a certain “Duckspeak”—“just because we can do something, doesn’t mean we should do it”—has been repeated again and again by government officials since 2013?
They do it—collecting all the data they can collect—regardless of whether they should or shouldn’t (see the Wall Street Journal article on “U.S. Spies on Millions of Cars”), not only because, as Snowden said, it’s cheap and easy. They do it because they believe in the ideology of Big Data, a Google invention which has been adopted and amplified by many other data-collecting entities. “That building a giant haystack is the way to go, and that you don’t need even to know what needle you are looking for, it will simply ‘emerge’ from the data, is certainly what the NSA learned from big data advocates,” I wrote when the first Snowden revelations came out in June 2013. Now Mattathias Schwartz writes in “The Whole Haystack” (reviewing the one and only case, by NSA’s admission, where bulk phone-records collection stopped a terrorist activity—sending a few thousand dollars to Somalia): “By flooding the system with false positives, big-data approaches to counterterrorism might actually make it harder to identify real terrorists before they act.”
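Schwartz’s point about false positives is ordinary base-rate arithmetic. A quick sketch with invented, purely illustrative numbers (these are not actual NSA or threat-population figures) shows how even a very accurate filter, applied to everyone, buries the real needles:

```python
# Illustrative base-rate arithmetic: why collecting the whole haystack
# floods analysts with false positives. All numbers are assumptions.
population = 300_000_000      # people swept into the haystack
true_threats = 300            # actual needles (assumed)
sensitivity = 0.99            # chance a real threat gets flagged
false_positive_rate = 0.001   # chance an innocent person gets flagged

true_positives = true_threats * sensitivity
false_positives = (population - true_threats) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"Innocents flagged: {false_positives:,.0f}")
print(f"Real threats flagged: {true_positives:,.0f}")
print(f"Chance a given flag is a real threat: {precision:.4%}")
```

Even with a filter that wrongly flags an innocent person only 0.1% of the time, the flagged innocents outnumber the flagged real threats by roughly a thousand to one; this is the sense in which a bigger haystack makes the needle harder, not easier, to find.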
It’s not efficient, it’s not effective, and it may even do more harm than good. But above all, the ideology of big data turns us into machines, making decisions based on code, substituting rules and laws for ethics and values.
The lure of big data and the temptation to collect it all is everywhere you turn. At the symposium, I overheard a Harvard Computer Science professor compliment an undergraduate student on her wise choice of concentration, bio-informatics, telling her that a large pharmaceutical company told him “they have so much data, there are probably 3 or 4 drugs hidden there.” This blind belief in the automated discovery powers of lots and lots of data is fine when all that is at stake is a corporation’s profits. It is, however, a critical endurance test for the government of the people, by the people, for the people.
[Originally published on Forbes.com]