A. I. Projects

Artificial Intelligence Lab Projects (AI HUB)

  • A. I. Hub Machine Learning Methods & Algorithms Debategraph – A. I. Hub invites you to improve this Debategraph visualization of machine learning methods and algorithms. Your constructive input is highly welcome: please leave comments, or contact us to request access as a contributor to this Machine Learning & Algorithm Map. So, join the community!

  • BrainHubSM – Harnessing the technology that helps the world explore brain and behavior. What is happening in the brains of people with autism or neurodegenerative diseases? How do we get our brain to learn new information, or to even heal itself? Carnegie Mellon University knows that the answers to these, and other, critical brain science questions lie at a pivotal intersection between biology, neuroscience, psychology, computer science, statistics and engineering – areas where CMU excels. And the world has taken notice of CMU’s excellence, putting the university at the hub of unique global partnerships focused on brain research.

  • 2045 Initiative / “Project Immortality” – The 2045 Initiative is a nonprofit organization whose goal is to create a network community of the world’s leading scientists in the field of life extension by means of cybernetic technologies, and to support them as an investment hub contributing to various projects.
  • Synthetic Neurobiology Group – Your brain mediates everything that you sense, feel, think, and do. A challenge for humanity is to understand the brain at a level of abstraction that enables the engineering of its function — so that it becomes possible to understand how the brain computes, and also to treat intractable brain disorders. We are inventing new tools for analyzing and engineering brain circuits. For example, we have devised, often working in interdisciplinary collaborations, ‘optogenetic’ tools, which enable the activation and silencing of neural circuit elements with light, 3-D microfabricated neural interfaces that enable control and readout of neural activity, and robotic methods for automatically recording intracellular neural activity and performing high-throughput single-cell analyses in the living brain. We distribute tools as freely as possible, and are using our inventions to enable systematic approaches to neuroscience, revealing how neurons work together in circuits to generate behavior, and empowering new therapeutic strategies for neurological and psychiatric disorders.
  • Project Joshua Blue – According to researchers at IBM’s Thomas J. Watson Research Center, the main goal of Joshua Blue is “to achieve cognitive flexibility that approaches human functioning”. In short, IBM is aiming to design Joshua Blue to ‘think like a human’, mainly in terms of emotional thought; similar IBM projects focusing on logical thought and strategic reasoning include Deep Blue, a logic-based chess-playing computer, and Watson, a question-driven artificial intelligence software program. Currently, the vast majority of computers and computational systems run on an input-output model: some sort of input is entered and some output is given back. Through Project Joshua Blue, IBM hopes to develop computers to the point where they ask questions and search for answers themselves, rather than relying on external input to run or merely crunching numbers to give a pre-programmed response once given a task. If IBM succeeds, the artificial intelligence knowledge gained from Project Joshua Blue could potentially be used to create social robots that work and act much as humans do. These robots could take over tasks too dangerous for humans to engage in, even tasks requiring many different decisions along the way; the technological advances gained through Joshua Blue’s potential success would allow the robots to think for themselves and work their way through problems just as humans do.
  • The MindForth Project – Arthur T. Murray – Implements spreading activation in AI Minds written in Forth and JavaScript. The MindForth Project is coded in Forth to think with concepts based on a linguistic theory of mind. The AI uses an inference module for automated reasoning. MindForth has branched out into the Wotan Project to think in German and into the Dushka Project to think in Russian.
  • Russell Wallace – Turn programs into procedural knowledge, via logical reasoning about code, guided by heuristics both hand-coded and automatically learned.
  • Matt Mahoney – Language model evaluation and cost estimation by text compression.
  • Matt Mahoney – AGI = lots of narrow specialists + a distributed index for routing messages to the right experts + economic incentives to be useful in a decentralized, hostile market.
  • Will Pearson – Designing an architecture that allows experimental code creation without interfering with other parts of the system, while allowing those parts to change in a purposeful fashion. (Note: not a full AGI approach, but a prerequisite project.)
  • YKY (Yan King Yin) – higher-order logic + fuzzy-probabilistic calculus + inductive learning.
  • Joseph Henry  – An architecture based on replicating human cognitive abilities through direct engineering of self-modifying discrete task modules, held together via a highly general knowledge representation language.
  • Vicarious – Vicarious is developing machine learning software based on the computational principles of the human brain. Our first technology is a visual perception system that interprets the contents of photographs and videos in a manner similar to humans. Powering this technology is a new computational paradigm we call the Recursive Cortical Network™.
  • The Blue Brain Project – Reconstructing the brain piece by piece and building a virtual brain in a supercomputer is the main goal of the Blue Brain Project, which will give neuroscientists a new understanding of the brain and a better understanding of neurological diseases.
  • Quantum Artificial Intelligence Lab – A collaboration between Google, NASA Ames Research Center and USRA studying the application of quantum optimization to difficult problems in Artificial Intelligence. “We believe quantum computing may help solve some of the most challenging computer science problems, particularly in machine learning. Machine learning is all about building better models of the world to make more accurate predictions. If we want to cure diseases, we need better models of how they develop. If we want to create effective environmental policies, we need better models of what’s happening to our climate. And if we want to build a more useful search engine, we need to better understand spoken questions and what’s on the web so you get the best answer.”
  • DataKind™ – brings together leading data scientists with high-impact social organizations through a comprehensive, collaborative approach that leads to shared insights, greater understanding, and positive action through data in the service of humanity. We believe that improving the quality of, access to, and understanding of data in the social sector will lead to better decision-making and greater social impact. To do this, we offer the following services:

    DataDives – Weekend events that bring the data science community together with the non-profit community to tackle tough data problems in just a short period of time. They promote learning about new skills and methods in data science, bring talented data scientists to the table with social organizations and knowledge experts, and yield truly amazing results from just a weekend.

    DataCorps – We are in the midst of a data revolution. Nearly every interaction we have with each other and our environment is now digitized and stored, creating vast opportunities to better understand, react to, and shape our world. DataKind’s DataCorps is an elite group of data scientists dedicated to using data in the service of humanity. The DataCorps teams with social organizations such as governments, foundations, and NGOs for 3-6 month collaborations to clean, analyze, visualize, and otherwise make use of data to make the world a better place.

    DataChapters – At DataKind, we pledge to be a force for positive change in this new data landscape, bringing data services, education, and support to those tackling the world’s toughest social problems. However, we can’t do it alone. DataKind Chapters spread our pledge across the globe, helping people everywhere use data for the greater good.

  • BigDog (Boston Dynamics) – BigDog runs at 4 mph, climbs slopes up to 35 degrees, walks across rubble, climbs muddy hiking trails, walks in snow and water, and carries a 340 lb load. Development of the original BigDog robot was funded by DARPA. Work to add a manipulator and do dynamic manipulation was funded by the Army Research Laboratory’s RCTA program. For a paper that summarizes the BigDog program, click here, or for overview slides click here.
  • Project Halo – A long-term artificial intelligence vision to create a “Digital Aristotle” – a computer that contains large amounts of knowledge in machine-computable form that can answer questions, explain those answers, and discuss those answers with users.
  • The SYMBRION and REPLICATOR projects – The main focus of these projects is to investigate and develop novel principles of adaptation and evolution for symbiotic multi-robot organisms based on bio-inspired approaches and modern computing paradigms. Such robot organisms consist of super-large-scale swarms of robots, which can dock with each other and symbiotically share energy and computational resources within a single artificial-life-form.
  • Artificial Intelligence Group at JPL – Performs basic research in the areas of Artificial Intelligence Planning and Scheduling, with applications to science analysis, spacecraft commanding, deep space network operations, and space transportation systems.
  • Cyc – An artificial intelligence project that attempts to assemble a comprehensive ontology and knowledge base of everyday common sense knowledge, with the goal of enabling AI applications to perform human-like reasoning.
  • OpenCog – An open-source software initiative aimed at directly confronting the challenge of creating beneficial artificial general intelligence (AGI), with broad capabilities at the human level and ultimately beyond. An integrative architecture designed to embody synergies between multiple learning algorithms and representations. Current work focuses on controlling a learning agent in a virtual world, with robotics work on the horizon.
  • A.I.L.E.E.N.N. (Artificial Intelligence Logical Electronic Emulation Neural Network) – A cloud-based PaaS and IaaS for A.I., represented as a ubiquitous entity based on Neural Networks and Fuzzy Logic with universal inputs, outputs and actuators, aiming at democratization and human-like interaction as the ultimate resource planner, decision-making process and actuator.
  • Ai Research – An international project with a research center near Tel Aviv founded to create true artificial intelligence making it possible for humans and computers to speak to each other in everyday language.
  • The SyNAPSE Project – In many ways computers today are nothing more than very fast number-crunchers and information manipulators. They can process lots of data, but they really don’t think. They all adhere to the Von Neumann architecture, largely unchanged in the last half-century, in which computers are constructed by separating memory and processing and operate by executing a series of pre-written “if X then do Y” equations. With the advent of Big Data, which grows larger, faster and more diverse by the day, this type of computing model is inadequate to process and make sense of the volumes of information that people and organizations need to deal with. In searching for an answer, IBM researchers found inspiration for a new computer chip design from the most powerful, efficient information processing device in the world: the human brain. The cognitive capabilities of the brain include understanding the surrounding environment, dealing with ambiguity, and acting in real time and within context – all while consuming less power than a light bulb and occupying less space than a two-liter bottle of soda. In August 2011, as part of the SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project, IBM researchers led by Dharmendra S. Modha successfully demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of “neurosynaptic cores” that brought memory, processors and communication into close proximity. These new silicon neurosynaptic chips allow for computing systems that emulate the brain’s computing efficiency, size and power usage.
  • A Map of Cognitive System Research in Europe – EUCognition.org has compiled and geolocated most of the academic and industry partners in Cognitive Systems FP7 projects in this map. We’ve included labs, research groups, university departments and companies. Please let us know who we are missing so we can represent the community accurately.

    Although care has been taken to ensure the accuracy and completeness of the information displayed in this map, there may be inadvertent factual inaccuracies or omissions. Please send us an email if you spot one (social@eucognition.org).

    Sources: • Cordis Website (http://cordis.europa.eu/), • FP7 projects websites, • Institution Websites
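The spreading-activation idea behind the MindForth Project above can be illustrated with a short sketch. This is not Murray’s Forth code, just a hypothetical Python toy: activation fans out from seed concepts across a concept graph and is attenuated at each hop.

```python
def spread_activation(graph, seeds, decay=0.5, steps=2):
    """Toy spreading activation: activation fans out from the seed
    concepts along `graph` edges, attenuated by `decay` per hop."""
    activation = dict(seeds)  # concept -> current activation level
    for _ in range(steps):
        nxt = dict(activation)
        for concept, level in activation.items():
            for neighbour in graph.get(concept, []):
                nxt[neighbour] = nxt.get(neighbour, 0.0) + level * decay
        activation = nxt
    return activation

# Hypothetical concept graph, not taken from MindForth itself.
graph = {"dog": ["animal", "bark"], "animal": ["alive"]}
print(spread_activation(graph, {"dog": 1.0}))
```

Activating “dog” also raises “animal” and “bark”, and, more weakly, “alive” two hops away — the basic mechanism by which a concept-based mind model can bring related ideas into play.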
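Matt Mahoney’s compression-based evaluation entry above rests on a simple identity: a language model that predicts text well can encode it in few bits per character, so compressed size serves as a proxy for model quality. A minimal sketch, using a general-purpose compressor as a stand-in for a real language model:

```python
import zlib

def bits_per_character(text: str) -> float:
    """Code length per character when `text` is compressed with
    zlib, standing in for a language model's predictions."""
    data = text.encode("utf-8")
    return 8 * len(zlib.compress(data, 9)) / len(data)

# Predictable, repetitive text should cost far fewer bits per
# character than the 8 bits of its raw byte encoding.
sample = "the quick brown fox jumps over the lazy dog " * 50
print(f"{bits_per_character(sample):.2f} bits/char")
```

A serious evaluation in this style would swap zlib for the model under test and measure its code length on a fixed corpus; the ranking principle is the same.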
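Mahoney’s second entry above — narrow specialists plus a distributed index for routing messages — can also be sketched in miniature. The specialists and keyword index below are invented for illustration; a real system would route over a distributed index rather than an in-memory dict:

```python
def route(query: str, index: dict) -> str:
    """Send a query to the specialist whose advertised keyword
    set overlaps the query's words the most."""
    words = set(query.lower().split())
    return max(index, key=lambda expert: len(words & index[expert]))

# Hypothetical specialists and the keywords they advertise.
index = {
    "weather-bot": {"rain", "forecast", "temperature"},
    "math-bot": {"integral", "prime", "sum"},
}
print(route("will it rain tomorrow", index))
```

The economic-incentive half of the formula is outside this sketch: in Mahoney’s picture, specialists that answer usefully would be rewarded, so the index evolves toward competent experts.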


Projects by the Apache Software Foundation (ASF)

  • Apache Flink
    Flink operates with a fault-tolerant, “exactly once” streaming data engine at its core, and treats batch operations as a special case of streaming. It offers libraries for machine learning and graph processing and is compatible with Apache Kafka and HBase, amongst other Hadoop components. Flink may sound a lot like Spark, but its fault-tolerant, streaming-first paradigm makes it a little different. Meanwhile, the two engines have a great deal of overlap in the scenarios they address, and some competition between the two may ensue. Grab the popcorn and keep an eye on this one.
  • Apache Samza
    The Samza project is focused on streaming data processing. That space may seem to be already well-served by Storm, Spark and Flink (now that you know about it), but Samza has a few tricks up its sleeve. It works with Kafka and YARN out of the box but, according to its Web site, offers a pluggable architecture that enables integration with other messaging and execution engines. Let’s be clear about something: streaming data platforms are enjoying a hype cycle of their own at present. As is the nature of hype, this occurrence of it is disproportionate to the amount of streaming data work being done out there. But it’s still important and will likely have the effect of making stream processing more, uh, mainstream. The combination of Kafka and Samza is one with some cachet, as both were developed at LinkedIn. Code used in production before its open source project launches, especially at a big social media company with pressing Big Data concerns, naturally commands authority and attracts attention. But whether that’s enough to overcome the popularity of Spark Streaming and the broad support (especially from Hortonworks) for Storm remains to be seen.
  • Ibis (Cloudera-incubated)
    I’ve written about Ibis before. It’s a Cloudera-incubated project, geared to data scientists, that aims to bring the Python programming language into the world of distributed applications. Much as Revolution Analytics (now owned by Microsoft) has done for the R programming language, the Ibis team is working out a way for Python code to execute across nodes in a cluster instead of on a single workstation or server. Interestingly, Ibis achieves this distributed capability by piggy-backing on Impala, a massively parallel processing (MPP) SQL-on-Hadoop project, also incubated at Cloudera. The project team does, however, aim to make Ibis’ coupling with Impala a loose one, so that it could run on other distributed platforms as well. Given Python’s popularity (alongside R) in machine learning and predictive analytics, and given the importance of distributed computing to both of those pursuits, Ibis’ uptake is worth monitoring.
  • Apache Twill (incubating)
    The Twill project provides an abstraction layer over YARN, Hadoop’s clustering and resource manager. YARN is the component that decouples Hadoop from the MapReduce algorithm, permitting it to run while also allowing other processing engines — including Spark and Flink — to take its place. In doing so, YARN effectively turns Hadoop into a more general distributed computing platform. That, of course, has a lot of value. The problem, however, is that YARN is complex, and has a steep learning curve. Twill’s abstraction layer aims to make YARN development accessible to mainstream Java developers. Its team of 10 committers is led by Arun Murthy, Hortonworks’ Founder and Architect, and the driving force behind YARN and Tez. That indicates dedication to making Twill effectively part of the YARN offering, and that’s pretty exciting.
  • Apache Mahout-Samsara
    Mahout is a machine learning engine that is neither new nor obscure in the land of Hadoop ecosystem projects. But I cover it here because it has gone through a major revamp with its 0.10.0 release in April, when a new mathematics environment called Samsara was added. Notably, Samsara runs on Apache Spark, not merely optionally but as a hard dependency. This changes Mahout from a MapReduce abstraction layer that inherits all of Hadoop MapReduce’s overhead into a more modern, responsive, scalable machine learning library. According to the project’s Web site, MapReduce-based versions of Mahout’s machine learning algorithms will continue to be supported, but no Hadoop implementations of new algorithms will be accepted for inclusion in the project. This creates some interesting inter-project competitive scenarios. First off, it puts Mahout itself into competition with Spark’s own MLlib component. Second, it adds fuel to the competition between Spark and Hadoop itself.
  • Hadoop
    If this small sampling of some of the many Big Data open source projects out there shows anything, it’s that Hadoop isn’t merely like a city, but rather a major metropolitan area. It has its suburbs, where its mayor has no jurisdiction, and where political beliefs may differ from those in the center of town. But it has its core character and it must be treated as a market in its own right. Practitioners have to approach “greater” Hadoop, not just the core project itself, or they risk missing trends in its adoption and evolution.
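Flink’s “batch as a special case of streaming” idea, mentioned above, can be made concrete with a toy sketch. This is plain Python, not Flink’s actual API: a streaming operator consumes records one at a time and emits running results, and a batch job is simply the same operator fed a bounded stream.

```python
from collections import Counter
from typing import Iterable, Iterator

def streaming_word_count(lines: Iterable[str]) -> Iterator[dict]:
    """Incremental word count: emit the running totals after every
    record, whether the input stream is bounded or not."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
        yield dict(counts)

# A "batch" is just a finite stream; its last snapshot is the
# batch result.
batch = ["to be or", "not to be"]
for snapshot in streaming_word_count(batch):
    result = snapshot
print(result)
```

The same operator serves both workloads, which is the essence of the streaming-first design: batch semantics fall out for free once a stream is known to end.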
