Multi-Agent Systems

What are Multi-Agent Systems?

In Artificial Intelligence research, agent systems have been recognized as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are computer programs that act autonomously on behalf of their users, across open and distributed environments, to solve problems. Many applications, however, require multiple agents that can work together. A Multi-Agent System (MAS) is a loosely coupled network of software agents that cooperate to solve problems that are difficult or impossible for an individual agent or a monolithic system to solve.

Simple Reflex Agents

Model-Based Reflex Agents

Goal-Based Agents

Agents can be divided into:

  • Active Agents – Agents having certain goals
  • Passive Agents – Agents without goals
  • Cognitive Agents – Agents with complex goals or operations.

Depending on the environment they operate in, agents can also be classified as:

  • Discrete Environment – A discrete environment has fixed locations or time intervals.
  • Continuous Environment – A continuous environment could be measured quantitatively to any level of precision.
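To make these categories concrete, here is a minimal Python sketch of a passive (simple reflex) agent and an active (goal-based) agent. The two-location vacuum-style world, percept format, and rules used here are illustrative assumptions rather than part of any particular framework.

    # Minimal sketch of two agent styles; the percepts, rules, and goal
    # below are invented for illustration only.

    def reflex_agent(percept):
        """Passive / simple reflex agent: fixed condition-action rules, no goal."""
        location, is_dirty = percept
        if is_dirty:
            return "clean"
        return "move_right" if location == "A" else "move_left"

    def goal_based_agent(percept, goal="all_clean", world_model=None):
        """Active / goal-based agent: compares a remembered model of the
        world against an explicit goal before choosing an action."""
        world_model = {} if world_model is None else world_model
        location, is_dirty = percept
        world_model[location] = "dirty" if is_dirty else "clean"
        if goal == "all_clean" and all(v == "clean" for v in world_model.values()):
            return "idle"  # goal satisfied, nothing left to do
        if is_dirty:
            return "clean"
        return "move_right" if location == "A" else "move_left"

    # Both agents perceive a dirty square at location "A".
    print(reflex_agent(("A", True)))      # -> "clean"
    print(goal_based_agent(("A", True)))  # -> "clean"

A cognitive agent would extend the goal-based version with richer reasoning, for example planning a sequence of moves rather than reacting to one percept at a time.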

 

Advantages of Multi-Agent Systems:

  • A MAS distributes computational resources and capabilities across a network of interconnected agents, whereas a centralized system may be plagued by resource limitations and performance bottlenecks.
  • A MAS allows for the interconnection and interoperation of multiple existing systems.
  • A MAS models problems in terms of autonomous interacting component-agents, which is proving to be a more natural way of representing task allocation, team planning, user preferences, open environments, and so on.
  • A MAS provides solutions in situations where expertise is temporally and spatially distributed.

 

Applications:

  • Intelligent monitoring of airline operations.
  • Self-healing networks.
  • Military logistics planning.

 

 

Computer Vision

What is Computer Vision?

Humans use their vision to see things and interpret them with their brain. Similarly, the aim of computer vision is to make computers perceive, process, and understand visual data such as images and videos. The ultimate goal of computer vision is to model, replicate, and, more importantly, exceed human vision using computer software and hardware at different levels. It draws on knowledge from computer science, electrical engineering, mathematics, physiology, biology, and cognitive science.

 

 


Object Detection and Tracking

 

 

The Computer Vision Hierarchy:

  • Low-level vision: Processing the image to extract features such as edges and corners.
  • Middle-level vision: Object recognition, motion analysis, and 3-D reconstruction using the features obtained from low-level vision.
  • High-level vision: Interpretation of the information from middle level. Interpretation may include a conceptual description of a scene like activity, intention, and behavior.
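As a rough illustration of the low-level stage, the sketch below uses OpenCV to pull out edge and corner features from an image; the file name and threshold values are arbitrary assumptions, and the middle and high levels would then reason over these features rather than over raw pixels.

    # Low-level vision sketch with OpenCV: extract edges and corners.
    # "scene.jpg" and the numeric thresholds are placeholder assumptions.
    import cv2
    import numpy as np

    image = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)

    # Edge features via the Canny edge detector.
    edges = cv2.Canny(image, 100, 200)

    # Corner features via Shi-Tomasi "good features to track".
    corners = cv2.goodFeaturesToTrack(image, maxCorners=100,
                                      qualityLevel=0.01, minDistance=10)

    print("edge pixels:", int(np.count_nonzero(edges)))
    print("corners found:", 0 if corners is None else len(corners))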

Computer vision is, in some ways, the inverse of computer graphics. While computer graphics produces image data from 3D models, computer vision often produces 3D models from image data. Computer vision includes 3D analysis from 2D images: analyzing the 3D scene projected onto one or several images, e.g., reconstructing structure or other information about the 3D scene from those images. Computer vision often relies on more or less complex assumptions about the scene depicted in an image. (via Wikipedia)


Typical tasks of computer vision:

  1. Recognition
  2. Motion Analysis
  3. Scene Reconstruction
  4. Image Restoration

The applications of computer vision are numerous and include:

  • Augmented Reality
  • Autonomous Vehicles
  • Biometrics
  • Character Recognition
  • Forensics
  • Face Recognition
  • Gesture Analysis
  • Medical Image Analysis
  • Process Control
  • Remote Sensing
  • Robotics
  • Security and Surveillance
  • Transport

 

Elastic Computing

Auto Scaling:

Elastic computing is a concept in cloud computing in which computing resources can easily be scaled up and down by the cloud service provider: in short, the provision of flexible computing power as and when required. In practice, cloud elasticity matters most to e-commerce, mobile and web development, SaaS, and other software development companies.


Amazon Elastic Compute Cloud (EC2) – Virtual Server Hosting

Load Balancing:

Elasticity aims at providing a service with exactly the amount of resources it requires: neither under-provisioning nor over-provisioning. Allocating more resources than required should be avoided, because the service owner has to pay for every resource allocated; under-provisioning should be avoided because it degrades the quality of service delivered to customers.
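One common way to approximate "exactly as much as needed" is a target-tracking auto scaling policy. The boto3 sketch below, in which the Auto Scaling group name and the 50% CPU target are assumptions, asks AWS to add instances when average CPU utilization rises above the target and remove them when it falls below.

    # Hedged sketch: a target-tracking scaling policy via boto3.
    # "my-web-asg" and the 50% CPU target are placeholder assumptions.
    import boto3

    autoscaling = boto3.client("autoscaling", region_name="us-east-1")

    autoscaling.put_scaling_policy(
        AutoScalingGroupName="my-web-asg",
        PolicyName="keep-cpu-near-50",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 50.0,  # scale out above ~50% CPU, scale in below it
        },
    )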

Monitoring:

Elastic applications continuously allocate and deallocate computing resources according to their use, which makes cloud resources more volatile and leaves traditional monitoring tools ill-suited to tracking that usage.

Amazon EC2 is a web service that provides resizable compute capacity in the cloud. It is a virtual computing environment that enables customers to use web service interfaces to launch instances with a variety of operating systems, load them with custom applications, manage network access permissions, and run their images on as many or as few systems as they need.
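For example, launching a single virtual server through the EC2 API could look like the boto3 sketch below; the AMI ID, key pair name, and instance type are placeholders that would need real values.

    # Hedged sketch: launching a virtual server through the EC2 API via boto3.
    # The AMI ID, key name, and instance type are placeholder assumptions.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-12345678",   # placeholder machine image
        InstanceType="t2.micro",  # small instance type for illustration
        KeyName="my-key-pair",    # placeholder SSH key pair
        MinCount=1,
        MaxCount=1,
    )

    print("launched:", response["Instances"][0]["InstanceId"])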

 


An Elastic Infrastructure provides preconfigured virtual server images, storage, and network connectivity that may be provisioned by customers using a self-service interface. Monitoring information on resource utilization is provided to support traceable billing and the automation of management tasks.

Data Mining

What is data mining?

Data mining is a process used by companies to turn raw data into useful information. By using software to look for patterns in large batches of data, businesses can learn more about their customers and develop more effective marketing strategies as well as increase sales and decrease costs. Data mining depends on effective data collection and warehousing as well as computer processing.


The phrase data mining is commonly misused to describe software that presents data in new ways. True data mining software doesn’t just change the presentation, but actually discovers previously unknown relationships among the data.

Data mining is popular in the science and mathematical fields but also is utilized increasingly by marketers trying to distill useful consumer data from Web sites.

Data Warehouse:

The drop in the price of data storage has given companies willing to make the investment a tremendous resource: data about their customers and potential customers stored in "data warehouses." Data warehouses, which are fast becoming a standard part of business technology, are used to consolidate data located in disparate databases. A data warehouse stores large quantities of data by specific categories so it can be more easily retrieved, interpreted, and sorted by users. Warehouses enable executives and managers to work with vast stores of transactional or other data to respond faster to markets and make more informed business decisions. It has been predicted that every business will have a data warehouse within ten years.

But merely storing data in a data warehouse does a company little good. Companies will want to learn more about that data to improve their knowledge of customers and markets. The company benefits when meaningful trends and patterns are extracted from the data.


Data Mining Technologies

The analytical techniques used in data mining are often well-known mathematical algorithms and techniques. What is new is the application of those techniques to general business problems made possible by the increased availability of data and inexpensive storage and processing power. Also, the use of graphical interfaces has led to tools becoming available that business experts can easily use.

Some of the tools used for data mining are:

Artificial neural networks – Non-linear predictive models that learn through training and resemble biological neural networks in structure.

Decision trees – Tree-shaped structures that represent sets of decisions. These decisions generate rules for the classification of a dataset.

Rule induction – The extraction of useful if-then rules from data based on statistical significance.

Genetic algorithms – Optimization techniques based on the concepts of genetic combination, mutation, and natural selection.

Nearest neighbor – A classification technique that classifies each record based on the records most similar to it in a historical database.
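As a small illustration, two of these techniques – decision trees and nearest neighbor – are available off the shelf in scikit-learn; the tiny "customer" dataset below is invented purely to show the fit-and-predict workflow.

    # Hedged sketch: decision-tree and nearest-neighbor classifiers in scikit-learn.
    # The toy customer dataset is invented for illustration only.
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier

    # Features: [age, yearly_spend]; labels: 1 = likely repeat buyer, 0 = not.
    X = [[25, 300], [40, 1200], [35, 900], [23, 150], [52, 2000], [31, 400]]
    y = [0, 1, 1, 0, 1, 0]

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

    new_customer = [[30, 1000]]
    print("decision tree:", tree.predict(new_customer))
    print("nearest neighbor:", knn.predict(new_customer))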

Applications of Data Mining:

Data mining has many applications, some of which are mentioned below:

  • Analysis
  • Fraud Detection
  • Intrusion Detection
  • Financial Banking
  • Research Analysis
  • Bioinformatics

 

Quantum Computing

So what’s Quantum Computing?

Quantum computing is the area of computing devoted to building computers based on the principles of quantum theory, which explains the nature and behaviour of energy and matter at the atomic and subatomic level. If made practical, quantum computers would boost computer performance a billion times beyond today's supercomputers. A quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states at once and to perform tasks using all possible permutations simultaneously. The units that make this possible are quantum bits, or qubits.


Source: IBM

Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once, quantum computers encode information as quantum-mechanical states, such as the spin direction of an electron or the polarization orientation of a photon. Such a state might represent a 1, a 0, some combination of the two, or a number expressing that the state of the qubit is somewhere between 1 and 0 – a superposition of many different numbers at once. A quantum computer can do an arbitrary reversible classical computation on all of these numbers simultaneously, which a binary system cannot, and it also has some ability to produce interference between the various different numbers.

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser – let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilised could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer composed of 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number – 2^500 is more than the number of atoms in the known universe (and this is true parallel processing – classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
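The 2^n growth can be made concrete with a few lines of NumPy. A single qubit pushed into an equal superposition by a Hadamard gate has two amplitudes, and the joint state of n such qubits (their tensor product) has 2^n amplitudes. This is only a classical simulation of the state vector, shown to illustrate the counting argument, not a quantum program.

    # Classical simulation sketch of the 2^n state-space argument (NumPy only).
    import numpy as np

    ket0 = np.array([1.0, 0.0])                   # the |0> state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    plus = H @ ket0          # equal superposition of |0> and |1>
    print(plus)              # approx. [0.707 0.707]

    # Joint state of n qubits = tensor product of single-qubit states.
    n = 10
    state = plus
    for _ in range(n - 1):
        state = np.kron(state, plus)

    print(len(state))  # 2**10 = 1024 amplitudes for just 10 qubits
    print(2 ** 500)    # how many amplitudes 500 qubits would describe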


 

By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. Using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.

 

Hyperloop is coming

So everyone has heard about Hyperloop. It is a conceptual high-speed transport system that runs on reduced-pressure tubes in which pressurised capsules ride on air (via Wikipedia). Sounds straight out of science fiction, doesn't it? Well, it used to, because it's happening now.

Hyperloop has been termed the "fifth mode" of transportation by Elon Musk, the CEO of Tesla Motors and SpaceX. Recently, Dirk Ahlborn, CEO of Hyperloop Transportation Technologies (HTT), reached an agreement with Slovakia to build the first Hyperloop there; one route could connect the capital, Bratislava, with Vienna and Budapest at speeds of up to 760 mph. According to Wired, the Slovakian Hyperloop is estimated to cost between $200m and $300m and to be completed by 2020.

 


A prototype of Hyperloop

Musk also holds competitions through SpaceX to encourage the design of a prototype passenger pod for the Hyperloop.


High energy efficiency coupled with electric propulsion yields an energy-elegant, carbon-free mode of transportation. And to enable on-demand transport, Hyperloop pods are much smaller than most planes and trains and are designed to depart as often as every 10 seconds.


In California, HTT’s rival Hyperloop Technologies is currently in the process of building a three-mile test track, which they hope to complete by the end of the year. Rob Lloyd, chief executive officer of Hyperloop Technologies, told CNN that their first commercial train could be ready as soon as 2021 or even 2020.