To gain a balanced perspective on where India stands in the Information Technology (IT) race today, it is vital to recount key factors: the state of software technology, World Trade Agreement constraints after 1997, semiconductors (the lifeblood of the IT industry), superfast computers, and the personal computer, laptop and tablet market.
Opportunities & Challenges
The IT industry offers both opportunities and challenges. The global market for IT services and e-commerce was $3,938.75 billion in 2022; India's share was just over $227 billion. The global chips industry was expected to grow to nearly US$600 billion by the end of 2022. On paper, India's semiconductor market is planned to grow at a compound annual rate of 19 per cent to $300 billion by 2026, which appears overly ambitious. The global computers market grew from $369.94 billion in 2021 to $416.79 billion in 2022; India accounts for a fraction of that, at around $11 billion. There is clearly a need to launch bold initiatives and incentives.
Where it all began
IT's origins are traced to Aristotle (384 to 322 BC), whose formal logic inspired Leibniz's dream of a "universal language" that subsequently led to the emergence of mathematical logic, pioneered by George Boole and Gottlob Frege in the 19th century. However, the IT revolution would not have been possible without the invention of zero. The first recorded placeholder zero appeared in Mesopotamia around the 3rd century BC. India did not lag behind. Between 300 and 200 BC, Pingala was the first to use the Sanskrit word 'shunya' for zero, and he described a binary number system, a place-value system. Aryabhata (476-550 AD) used zero as a placeholder and in algorithms for finding square roots and cube roots. Later, in AD 628, Brahmagupta defined zero for the first time and gave it a symbol: a dot beneath the numbers. The creation of zero led to algebra and calculus, which form the basis of computing and IT.
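Pingala's place-value insight, with zero as a placeholder, is exactly the positional notation computers use today. A minimal sketch (illustrative only, not from the source) shows why the placeholder matters:

```python
# Positional (place-value) notation: each digit's weight is base**position.
# Zero acts as a placeholder, marking an empty position without disturbing
# the weights of the other digits.

def from_digits(digits, base):
    """Interpret a list of digits (most significant first) in the given base."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# Decimal: the zero in 105 holds the tens place open.
assert from_digits([1, 0, 5], 10) == 105

# Binary (Pingala's system): 1011 in base 2 is 8 + 0 + 2 + 1 = 11.
assert from_digits([1, 0, 1, 1], 2) == 11
```

The same loop works for any base, which is why a place-value system with zero generalises from decimal arithmetic to the binary encoding inside every computer.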
“Technology for me is to discover, learn, evolve and implement. It combines 3Ss- speed, simplicity and service. Technology is fast, technology is simple and technology is a brilliant way to serve people. It is also a great teacher. The more we learn about technology and the more we learn through technology, the better it is.”
– Narendra Modi, Prime Minister
In the 1930s, the evolution of computer science from mathematical logic culminated in two landmark papers: Claude Shannon's "A Symbolic Analysis of Relay and Switching Circuits" and Alan Turing's "On Computable Numbers, with an Application to the Entscheidungsproblem." The primary reference for both was a 90-year-old work of mathematical philosophy, George Boole's "The Laws of Thought"; the Boolean data type in many programming languages is named after Boole. Turing's paper defined the template for computer design. Shannon's adviser, Vannevar Bush, had built an analog computer known as the Differential Analyser that rapidly solved differential equations.
In 1947, the transistor was invented at Bell Labs. Transistors were dramatically improved versions of the electrical relays Shannon had analysed, then the best-known way to physically encode Boolean operations. IBM enjoyed a near-absolute monopoly on selling and maintaining computers from 1950 to 1977. The first computers arrived in India in this period: in 1956, an HEC-2M at the Indian Statistical Institute (ISI); in 1964, a CDC 3600 at the Tata Institute of Fundamental Research; and in 1965 an IBM 1620 and in 1966 an IBM 7044 at IIT Kanpur. Thereafter, Tata started TCS with three computers purchased from IBM.
Software Growth in India
IBM decided to exit India in 1977 in protest against the new FERA (Foreign Exchange Regulation Act) rules. Meanwhile, the Department of Electronics (DoE) had incorporated Computer Management Corporation Private Limited (CMC) in December 1975. CMC's role was to maintain the computers handed over by IBM; in August 1977 it was converted into a public limited company wholly owned by the Government of India. The success of CMC, which ventured into writing software and won the contract to build the Indian Railways reservation system, created a positive perception of computers and served as a model for bank and Air India automation projects. In 2001, CMC was divested to Tata Consultancy Services (TCS) and privatised. Also in 1977, the National Centre for Software Development and Computing Techniques (NCSDCT) was carved out of the Tata Institute of Fundamental Research (TIFR); it was the first to demonstrate a wide area network in India, linking computers at TIFR and VJTI Mumbai over Bombay Telephone lines. Later, the DoE commissioned ERNET (Education and Research Network), modelled on ARPANET, to connect five IITs, IISc Bangalore, NCSDCT and the DoE.
India’s strategy to lure foreign companies
India's IT services industry is today again at a crossroads. Many low-cost destinations have emerged in the Philippines and South East Asia. An anti-outsourcing wave has swept through the UK and the US, so the Indian IT industry faces tougher visa norms and pressure to hire locally. There is also a shift in global demand for IT services: in an era of cloud computing, traditional development and maintenance jobs will shrink. Many low-end jobs will be automated, and the IT industry will have to move up the value chain to continue delivering value to its customers.
In 1997, India became a signatory to the World Trade Organisation's Information Technology Agreement, accepting the condition of "no duty on software, semiconductors, semiconductor manufacturing and testing equipment, computers, telecom equipment, scientific instruments, as well as most of the parts and accessories of these products". Consequently, India failed to consolidate and advance its hardware industry, particularly semiconductor chips, which were booming and are the lifeblood of production in the ongoing information age. China has set an ambitious target of producing 70 per cent of its chip needs in-house by 2025; assuming $220 billion of consumption, that is around $150 billion. The Chinese plan is to become self-sufficient in critical technologies by 2025, because American sanctions are depriving it of foreign imports. India's semiconductor market, pegged at $119 billion in 2021, is projected to grow at a compound annual rate of 19 per cent to $300 billion by 2026. New Delhi's strategy is twofold: lure in foreign companies, and build on areas where India has an advantage, such as chip design.
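The 19 per cent growth claim can be sanity-checked with the standard compound-growth formula. The market size and rate are from the article; the five-year horizon (2021 to 2026) is an assumption:

```python
# Compound annual growth: future = present * (1 + rate) ** years
present = 119.0   # India's semiconductor market in 2021, $ billion (from the article)
rate = 0.19       # claimed compound annual growth rate
years = 5         # 2021 -> 2026, assumed horizon

future = present * (1 + rate) ** years
print(round(future, 1))  # about 284
```

At 19 per cent compounded, $119 billion grows to roughly $284 billion by 2026, somewhat short of the stated $300 billion target, which supports the article's scepticism about the projection.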
Supercomputers, Laptops and Smartphones
Next, where does India stand in the "Top 500 Superfast Computers" ranking? The No. 1 spot is now held by the Frontier system in the US, which surpassed the 1 exaflop barrier to become the first exascale supercomputer. Compared on metrics such as Rmax and Rpeak (measured in TFlop/s), Indian systems in the Top 500 list pale into insignificance. As of November 2022, India stands jointly with three other nations at 13th place with four superfast computers, a tally that is woefully small compared to China with 173, the US with 149, Japan with 32, Germany with 26, France with 19, and Canada and the UK with several each.
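Rpeak is a system's theoretical peak rate, while Rmax is the rate it actually sustains on the LINPACK benchmark that the Top 500 list uses for ranking; their ratio gives a rough efficiency. A small sketch with hypothetical figures (not taken from the Top 500 list):

```python
# Top500 systems are ranked by Rmax (measured benchmark performance), not
# Rpeak (theoretical peak). Efficiency = Rmax / Rpeak indicates how much of
# the hardware's potential the benchmark actually sustains.

def benchmark_efficiency(rmax_tflops, rpeak_tflops):
    """Fraction of theoretical peak achieved on the benchmark."""
    return rmax_tflops / rpeak_tflops

# Hypothetical system: 4,600 TFlop/s sustained out of 5,300 TFlop/s peak.
eff = benchmark_efficiency(4600.0, 5300.0)
print(f"{eff:.0%}")  # about 87%
```

This is why two systems with similar Rpeak can rank quite differently: the ranking rewards sustained, not theoretical, performance.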
In 1970, the Government created the Department of Electronics (DoE) to regulate the electronics industry. One of the DoE's mandates was to build indigenous computers. The DoE also introduced the licence raj into the electronics industry, resulting in stunted growth of hardware. In 1971, the Government did not approve the Delhi Cloth Mills (DCM) proposal to collaborate with the Japanese giant Sony to build calculators. However, DCM built the first indigenous calculator in 1972, and in 1975 it launched India's first microprocessor.
National Supercomputing Mission
In 2015, the launch of the National Supercomputing Mission (NSM) boosted the pace of Indian supercomputing: a seven-year programme worth Rs 4,500 crore to install 73 indigenous supercomputers by 2022. However, as of May 2022 there are only 15 superfast computers in India. Four of them are in the Top 500 global ranking: PARAM Ananta (ranked 102), developed under the NSM by C-DAC and IIT Gandhinagar, with a peak performance of only 3.3 petaflops; PARAM Siddhi-AI at C-DAC (ranked 111, Rpeak 5,267.1); Pratyush (Cray XC40) at the Indian Institute of Tropical Meteorology, Pune (ranked 132, Rpeak 5,267.1); and Mihir (Cray XC40) at the National Centre for Medium Range Weather Forecasting, Noida (ranked 249, Rpeak 2,808.7). Beyond these four, other supercomputers in India include PARAM Shivay at IIT-BHU, PARAM Shakti at IIT Kharagpur, PARAM Brahma at IISER Pune, PARAM Yukti at JNCASR Bengaluru, PARAM Sanganak at IIT Kanpur, PARAM-ISHAN at IIT Guwahati, and PARAM Pravega at IISc Bengaluru.
In 2020, the Production Linked Incentive (PLI) scheme was launched. The outlay for the mobile phone PLI is Rs 40,951 crore over five years, with incentives ranging between 4 and 6 per cent annually. The Government's strategy, combined with a huge domestic market, helped India become the world's second-biggest mobile phone producer after China.
In 2021, a PLI scheme worth Rs 7,350 crore was launched to boost local manufacturing and exports of IT hardware (laptops, tablets, all-in-one personal computers and servers) with a view to cutting imports, especially from China. The results have been lacklustre: hardware production in the first year (FY22) was only Rs 2,000 crore, and only four of the 14 eligible players met their first-year production targets and will receive incentives. Only 18 per cent of all PCs sold in India are manufactured locally.
Currently, the Fourth Industrial Revolution is sweeping the world. Variously known as the Information Age, the Imagination Age, the Computer Age, or the Digital, Silicon or New Media Age, it began in the mid-20th century. There is a paradigm shift from traditional industry to an economy based primarily on IT, and a trend towards automation and data exchange in manufacturing technologies and processes, including cyber-physical systems, the Internet of Things (IoT), the industrial internet of things, cloud computing, cognitive computing and artificial intelligence, that offers tremendous opportunities.