Private equity investment in the artificial intelligence (AI) segment in the country grew over fivefold in 2018-19, to $359 million (nearly Rs 2,500 crore) from $63 million (Rs 440 crore) in FY18.

The growth was led by a $140 million investment in September 2018 by Mithril Capital, Blume Ventures, Tiger Global and others into robotics firm GreyOrange. According to data from research firm Venture Intelligence, there were 38 such deals in FY19, compared to 25 the previous year. In 2016-17, investment into AI totalled $50 million via 19 deals; the year before, $51 million via 13 deals.

Founded in 2011 by Samay Kohli and Akash Gupta, GreyOrange is a global leader in AI-powered robotics systems for flexible automation across fulfillment centers in the supply chain. Other top deals during the year included Wavemaker Partners, Denso Corporate Ventures, Vertex and others investing $65 million in semiconductor firm ThinCI in September 2018. Jungle Ventures, SoftBank Corp and others put $30 million into an enterprise software firm in November 2018. Rocketship.VC, Sequoia Capital India and others invested $20 million in enterprise software firm Mad Street Den in September 2018. SBI Venture Capital invested $15 million into robotic process automation firm AntWorks in July 2018.


Investor interest in AI is expected to keep growing. Investors note the potential it offers in retailing, e-commerce, health care, travel, banking and logistics, among other sectors. Experts say big companies are also tapping into the technology to prepare for the future.

According to a 2017 Accenture analysis, AI has the potential to add $957 billion, or 15 per cent of gross value added, to India's economy by 2035, compared to a scenario without AI. However, to avoid missing this opportunity, policy makers and business leaders must prepare for, and work toward, an AI revolution, it said. It recommends an innovative private sector, supported by an enabling policy and regulatory framework.

The Union government's Budget for 2018-19 mandated the NITI Aayog to establish a national programme on AI, aimed at guiding research and development in new and emerging technologies. Last June, the Aayog issued a report on a national strategy for the sector.

In the interim Budget this February, the government announced it would launch a national AI programme, to be catalysed by the establishment of a National Centre on AI as a hub, along with Centres of Excellence. Nine priority areas were identified, and plans to develop a National AI Portal were announced.

NITI Aayog says AI could address societal needs in five sectors: affordable access to quality health care, enhanced farmers' income and increased farm productivity, improved access to and quality of education, connectivity for the growing urban population, and smart mobility and transportation. It also identified challenges: a lack of expertise in the research and application of AI, and the absence of enabling data ecosystems, including access to intelligent data, privacy and security.

Apart from funding, a major challenge is the availability of adequately skilled personnel, say experts from the education sector. The All India Council for Technical Education has added AI, the Internet of Things and machine learning as mandatory subjects in some higher education programmes. Increased demand for products and services can attract more investment towards research and development, says iNurture Education Solutions, a Bengaluru-based company working on enabling career-ready formal higher education.

Source: Business Standard News

When Google DeepMind's AlphaGo shockingly defeated legendary Go player Lee Sedol in 2016, the terms artificial intelligence (AI), machine learning and deep learning were propelled into the technological mainstream.


AI is generally defined as the capacity of a computer or machine to exhibit or simulate intelligent behaviour, as seen in Tesla's self-driving cars and Apple's digital assistant Siri. It is a thriving field and the focus of much research and investment. Machine learning is the ability of an AI system to extract information from raw data and learn to make predictions from new data.
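As a toy illustration of that definition (this example is not tied to any system mentioned in the article, and the numbers are made up), here is "learning from raw data and predicting on new data" in its simplest form, a least-squares line fit in plain Python:

```python
# Toy training data: inputs x and noisy outputs y that roughly follow y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

# "Learning": fit a slope and intercept by ordinary least squares.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# "Prediction": apply the learned model to new, unseen input.
def predict(x_new):
    return slope * x_new + intercept
```

Real machine-learning systems fit far more complex models, but the pattern is the same: extract a pattern from the training data, then apply it to new data.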

Deep learning combines machine learning with artificial neural networks: algorithms inspired by the structure and function of the brain. Deep learning has received much attention lately, both in the consumer world and throughout the medical community.
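To make the brain-inspired idea concrete, here is a minimal sketch (illustrative only, with arbitrary made-up weights): each artificial neuron computes a weighted sum of its inputs followed by a nonlinearity, and a "deep" network is simply such layers stacked on layers.

```python
import math

def layer(inputs, weights, biases):
    """One neural-network layer: weighted sums followed by a sigmoid nonlinearity."""
    outputs = []
    for w_row, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-z)))  # sigmoid activation
    return outputs

# A tiny two-layer "network": the weights here are arbitrary; in practice
# they are learned from data rather than written by hand.
hidden = layer([0.5, -1.0], weights=[[1.0, 2.0], [-1.5, 0.5]], biases=[0.0, 0.1])
output = layer(hidden, weights=[[2.0, -2.0]], biases=[-0.5])
```

Production networks have millions of such units and learn their weights automatically, but the layered weighted-sum-plus-nonlinearity structure is the same.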

Interest in deep learning surged with the success of AlexNet, a neural network designed by Alex Krizhevsky that won the 2012 ImageNet Large Scale Visual Recognition Challenge, an annual image classification competition.

Another relatively recent advancement is the use of graphics processing units (GPUs) to power deep learning algorithms. GPUs excel at the computations (multiplications and additions) needed for deep learning applications, thereby lowering processing time.

In our lab at the University of Saskatchewan, we are doing deep learning research related to healthcare applications; as a professor of electrical and computer engineering, I lead the research team. When it comes to medical diagnosis, using AI or machine learning is new, and there has been exciting and promising progress.

BBC Newsnight: AlphaGo and the future of Artificial Intelligence.

Extracting blood vessels in the eye


Detecting abnormal retinal blood vessels is useful for diagnosing diabetes and heart disease. The retinal vessels must first be extracted from a retinal image in order to provide reliable and meaningful medical interpretations. Although manual segmentation is possible, it is a complex, time-consuming and tedious task which requires advanced professional skills.

My research team has developed a system that can segment retinal blood vessels simply by reading a raw retinal image. It is a computer-aided diagnosis system that reduces the work required of eye-care specialists and ophthalmologists, and processes images 10 times faster, while retaining accuracy.
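The lab's actual system is a deep network; purely to illustrate what "segmentation" means (this is not their method), here is a toy version that labels each pixel of a tiny synthetic image as vessel or background by thresholding brightness:

```python
# Toy segmentation sketch: label each pixel as vessel (1) or background (0).
# Real systems use trained deep networks; this illustrative version just
# thresholds intensity on a tiny synthetic 5x5 "retinal image".
image = [
    [10, 12, 90, 11, 10],
    [11, 95, 92, 13, 12],
    [94, 91, 12, 11, 10],
    [12, 10, 11, 93, 96],
    [10, 11, 12, 10, 95],
]

def segment(img, threshold=50):
    """Return a binary mask: 1 where a pixel is brighter than the threshold."""
    return [[1 if pixel > threshold else 0 for pixel in row] for row in img]

mask = segment(image)
```

A deep-learning segmenter replaces the fixed threshold with a learned, context-aware decision per pixel, which is what lets it cope with noise, lighting changes and fine vessel structure.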

Detecting lung cancer

Computed tomography (CT) is widely used for lung cancer diagnosis. However, because the appearance of benign (non-cancerous) and malignant (cancerous) lesions in CT scans is similar, a CT scan cannot always provide a reliable diagnosis. This is true even for a thoracic radiologist with many years of experience. The rapid growth of CT scan analysis has generated a pressing need for advanced computational tools to assist radiologists with the screening process.

To improve radiologists' diagnostic performance, we have proposed a deep learning solution. Based on our research findings, our solution outperforms experienced radiologists. Moreover, using a deep learning-based solution improves diagnostic performance overall and radiologists with less experience benefit from the system the most.

A screenshot of the lung cancer detection software. Credit: Seokbum Ko, Author provided

Limitations and challenges

Although great promise has been shown with deep learning algorithms in a variety of tasks across radiology and medicine, these systems are far from perfect. Obtaining high-quality annotated datasets will remain a challenge for deep learning training. Most computer vision research is based on natural images, but for healthcare applications, we need large annotated medical image datasets.

Another challenge from a clinical standpoint will be the time needed to test how well deep learning techniques perform compared with human radiologists.

There needs to be more collaboration between physicians and machine learning scientists. The high degree of complexity of human physiology will also be a challenge for machine learning techniques.

Another challenge is the set of requirements to validate a deep learning system for clinical implementation, which would likely require multi-institutional collaboration and large datasets. Finally, an efficient hardware platform is required to ensure fast processing of deep learning systems.

In the complex world of healthcare, AI tools can support human practitioners to provide faster service and more accurate diagnoses, and analyze data to identify trends or genetic information that may predispose someone to a particular disease. When saving minutes can mean saving lives, AI and machine learning may be transformative for healthcare workers and patients.

Source: PHYS.ORG

It’s not uncommon for new technology to spawn a new sport: Drone racing has become hugely popular around the world, and Segway Polo is apparently still a thing. It’s now artificial intelligence’s turn to give us a new pastime, but will the sport it invented, Speedgate, take the world by storm, or will it be relegated to this ad agency’s company picnic?

AKQA, a creative agency that works with brands like Nike and Beats, helped create the new sport for Design Week Portland that took place last week. As with many AI-assisted creations, it started by feeding a neural network a list of information about 400 existing sports, including how they’re played, how they’re won, and what rules apply. But unlike neural networks doing automatic face replacements, which rely on databases of thousands of source images, 400 is a limited sample size, and the AI’s suggestions required some human guidance and refinement.

TechCrunch spoke to Whitney Jenkins, AKQA’s creative director, who revealed that some of the AI’s proposed sports included a game played with exploding Frisbees (I’d watch that) and one involving hot air balloons and tightropes. Definitely new and unique, and definitely implausible. Humans helped the neural network refine its over 1,000 suggestions, and ultimately three games were physically tested before AKQA chose Speedgate and finalized its rules.


The game combines elements of several existing sports, including rugby, soccer and handball, and can be customized to suit large or small fields. Players pass a ball (a size four rugby training ball is currently used until the official Speedgate ball hits the market) to teammates by tossing, kicking, or punting it. The field is marked by three large rings encircling gates, and the ultimate goal is to get the ball through the opposing team's defended gates, which sit at either end. But to officially gain possession of the ball, it first has to be kicked through a gate located at the center of the field. It sounds like there's a lot to keep track of, and players won't have much time to stop and think, given they only have three seconds to either pass the ball or try to score.

The full rules and regulations for Speedgate are available on its flashy website, and AKQA appears to be putting a lot of effort into promoting its new sport right out of the gate, which might help it catch on. But let's not pretend this is an altogether altruistic move on the agency's part. The rate at which artificial intelligence has been evolving is staggering (and a little terrifying), and as a result, experimenting with AI is a surefire way to drum up some free publicity. Sports are big business, and while Speedgate doesn't require a pricey drone or expensive Segway to play, should it catch on, there'll be no shortage of ways to capitalize on its popularity.


In Collaboration with HuntertechGlobal

It’s a sunny day in Austin, Texas and after a modest run around Lady Bird Lake I cool off by walking down Congress Avenue. I stop on the Congress Avenue Bridge, gaze at the water and admire the people rowing, paddle boarding and canoeing on the lake. Then, I realize I have a decision to make; am I going to walk towards the historic Texas State Capitol to enjoy its architecture, or am I going to walk towards the restaurants and shops for some indulgence?

Imagine if my decision was influenced by my mobile device or my smartwatch anticipating my next steps, by evaluating my past behavior, on a Saturday in downtown Austin after a run at Lady Bird Lake. What if previous data captured indicates that I like to shop at the little boutiques on Congress, or that after I’m cooled down, I like to reward myself with some tasty Tex-Mex at Gueros? I wonder what level of access companies have to my buying habits, my location, and more importantly, how they use this data/intelligence to influence my decisions as a consumer.

The reality is that systems are collecting data on our individual behaviors every single day; the big question is what is done with that data. Is that data being leveraged to inform a corporation's strategy to be more effective in creating value for its customers? Is it being used to manipulate consumers? In an interview at the 40th International Conference of Data Protection and Privacy Commissioners, Apple CEO Tim Cook suggested that consumers are being surveilled, and our very own data is being used against us with military-like force. To me that is very concerning, so I asked Michelle Beistle, Chief Privacy Officer at Unisys, for her perspective on the notion that consumers are being surveilled. While she is equally concerned about what that implies, she is optimistic and believes that some data collection is good, helping companies better serve their customers, as long as it is collected and treated ethically and respectfully.


If you consider the ethics of the human side of being surveilled, imagine you are eating lunch at your favorite restaurant and a server observes you are low on water. They anticipate your need and refill your glass with more water, that’s not an issue, right? Admit it, as a consumer, it is nice when your needs are anticipated to allow for better service or a better customer experience. However, my daughter believes that iced tea should not be refilled until it is gone because it messes up the sweetener balance in the drink. I guess everybody has their own preferences, which is what makes predicting the needs of a consumer so difficult. To further complicate matters, Jim Stikeleather, a business professor at the University of South Florida and former colleague of mine, suggests that humans are not always rational decision makers. The bottom line is customers want customized services, and inaccurate predictions can be problematic. 

Improved customer service can result from enabling consumers and corporations to make faster and smarter decisions, which can be made possible by leveraging one of the top technology trends today, artificial intelligence (AI). According to technology philosopher Sharad Gandhi, applied AI is when a machine mimics "cognitive" functions that humans associate with other human minds, such as "learning" and "problem-solving." A caveat with AI is that we want to employ it to improve customer service through helpful suggestions or actions that don't make customers feel their privacy has been compromised or their decision-making process manipulated.

As a consumer, I think business leaders need to take a balanced approach to how they get to know their customers, and I am a big believer that the voice of the customer is critically important in building the products and services that create value for the customer. Robert Marshall, co-author of the upcoming book The Applied Innovation: The Field Guide, recommends that companies evaluate the incorporation of AI into their products and services, and that it is critical to analyze the business environment first to determine whether there is a specific business opportunity and associated outcome that justifies leveraging an innovative AI technology. He further states that, through the lessons learned in observing successful AI implementations, key triggers need to be created that will drive organizational action. For example, are government regulations working through the system that impact the privacy and accessibility of the data needed for the use case you are considering? In future articles, I will spotlight successful AI implementations where this approach can be put to use.

My story today is based on my perspective on AI as a consumer, and as a veteran in the technology industry who has seen my fair share of successful and unsuccessful implementations of new technologies. In the end, I decided to indulge myself in some shopping and a Tex-Mex treat, and honestly, my decisions were not influenced by my mobile device or smartwatch. But if I were on a run in Washington, D.C. instead of Austin, it might be valuable to me for AI technologies to leverage my patterns of behavior to provide me with recommendations on sites to see, places to shop, or a place to get a great meal. It could definitely help me make quicker and smarter decisions, which is great as long as the data is captured and treated in an ethical and respectful way.  

Source: Forbes

In 2018 there was a drastic change in tools and applications based on artificial intelligence; those changes affected not only the internet industry but also other industries such as healthcare, legal, agriculture, and manufacturing. AI is playing a crucial role in the development of the world's top companies, such as Google, Facebook and Amazon.

AI is transforming our daily business operations in a way we have never seen in any prior industrial revolution. It is optimizing and automating every business process, for example by recommending product 2 to a customer who bought product 1. With this in mind, let us look at the trends that build on the current technology.

The majority of industries have already started investing in AI and are reaping the benefits. AI is transforming the day-to-day life of the common man, along with the business world. Now let's look at the additions in 2019 that have made AI function better compared to 2018.

Facial Recognition

Facial recognition is a form of AI application that identifies people's faces from a digital image using the shapes and features of the face.

In its early days, facial recognition showed some negative impacts across industries, but in 2019 we are going to see this feature with higher reliability and accuracy. We are already aware of Facebook's DeepFace program, which can easily tag our family and friends using face recognition, and the iPhone already uses face recognition to unlock the phone.

Taking the AI facial recognition push forward, KFC restaurants operated by Yum China Holdings Inc., one of China's biggest fast-food chains, are using robotic arms to serve ice cream cones, and facial recognition technology to place orders and make payments.

Artificial Neural networks

Artificial neural networks work similarly to human neurons. The human brain consists of about 100 billion neurons. Artificial neurons are developed to make a computer, robot or machine act like a human being. These networks are fed vast amounts of data to learn from; they take data as input and deliver the requested services as output.
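A toy sketch of a single artificial neuron learning from data (illustrative only; real networks stack millions of such units): a perceptron fed examples of the logical AND function adjusts its weights whenever it answers wrongly, until its output matches the examples.

```python
# Toy perceptron: one artificial neuron learning the AND function from examples.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]  # weights, adjusted during learning
b = 0.0         # bias
lr = 0.1        # learning rate

def fire(x):
    """The neuron 'fires' (outputs 1) if its weighted input exceeds zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Learning loop: nudge weights toward each example it gets wrong.
for _ in range(20):  # a few passes over this tiny dataset are enough
    for x, target in samples:
        error = target - fire(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error
```

After training, the neuron reproduces the AND truth table without ever being given an explicit rule, which is the essence of learning from data.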

There is incredible demand for neural networks; they are used for composing music, improving order fulfillment, and diagnosing medical problems. Neural network technologies have improved a lot in 2019, and continued research will further improve their effectiveness.

Deep Learning

Machine learning, the most well-known type of AI algorithm, becomes challenging as the number of dimensions of the data increases. For instance, calculating the price of a home from existing home prices in an area involves just a couple of dimensions of data. Now imagine trying to transcribe your voice into text: the problem is compounded a hundred times.

Deep learning is also the technology behind self-driving vehicles and voice control. With the arrival of Amazon's Alexa and Google Home, there is a wide range of voice-enabled applications that use natural language processing (NLP) algorithms, an example of deep learning.

Quantum Computing

Quantum computers use quantum physics to perform calculations, and could solve complex tasks that are beyond the capabilities of conventional computers and even supercomputers.
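As a toy illustration of the underlying idea (a classical sketch, not a real quantum device): a qubit's state is a vector of amplitudes, a gate is a matrix applied to that vector, and measurement probabilities are the squared amplitudes.

```python
import math

# A qubit state is a 2-vector of amplitudes; the |0> state is [1, 0].
state = [1.0, 0.0]

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
h = 1.0 / math.sqrt(2.0)
hadamard = [[h, h], [h, -h]]

def apply(gate, s):
    """Apply a 2x2 gate matrix to a qubit state vector."""
    return [gate[0][0] * s[0] + gate[0][1] * s[1],
            gate[1][0] * s[0] + gate[1][1] * s[1]]

state = apply(hadamard, state)

# Measurement probabilities are the squared amplitudes: 50% for each outcome.
probs = [a * a for a in state]
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical machines cannot keep up and real quantum hardware becomes interesting.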

2019 will see more research on quantum computers, and on strategies to reduce error rates to make meaningful calculations possible. According to Andrew Childs, co-director of the Joint Center for Quantum Information and Computer Science (QuICS), "Current error rates significantly limit the lengths of computations that can be performed. We'll have to do a lot better if we want to do something interesting."


Chatbots

There are tremendous changes in chatbot trends. If you are surfing a website and need help, a chatbot will act like a human, though it is actually operated by software. For example, if a user wants to contact customer support, the chatbot will, on its own, respond to the customer's FAQs and direct that person to a customer support executive.
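The routing behaviour described above can be sketched as a minimal rule-based bot (the keywords and canned replies here are made up for illustration; production chatbots use NLP models instead of keyword matching):

```python
# Toy rule-based chatbot: match keywords in the user's message and either
# answer a canned FAQ or route the user to a human agent.
faq = {
    "hours": "We are open 9am-6pm, Monday to Friday.",
    "refund": "Refunds are processed within 5-7 business days.",
}

def reply(message):
    """Return an FAQ answer if a keyword matches, otherwise escalate to a human."""
    text = message.lower()
    for keyword, answer in faq.items():
        if keyword in text:
            return answer
    return "Connecting you to a customer support executive..."
```

For example, `reply("What are your hours?")` returns the opening-hours answer, while an unmatched question falls through to the human-handoff message.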

According to research, 25% of companies will use a chatbot or virtual assistant in their customer support by the end of 2020, and these chatbots are predicted to save businesses an estimated $8 billion annually by 2022.


