(This is the essay I wrote for Cisco IT Scholarship 2007. The topic was “A Career in Information Technology — Opportunities for Innovation and Challenges.”)
“Most academic scientists are aware that computer databases exist, but what bothers me is the smallness of the fraction that actually uses them.”
(Source: “The Challenge of IT: Proc. of the 41st FID Congress,” by KR Brown)
…And what was true of academic scientists two decades ago is almost certainly true of all of us, the proud potential users of ‘Information Technology’. Why should this be? I begin with the contention that the “problems” of the IT industry are not merely technical (hardware/software), but rather problems of relevance, of user behavior, and of perceived costs. IT encompasses much more than computers and communication.
For a prospective professional, the IT field has far more opportunities, well packaged with challenges, to offer than any other. Consider “information” as an integrated approach to enabling people to solve problems, and you will not find yourself in competition with other professions. I believe the societal trust this places on the IT professional is, in principle, justified.
Waiting in the wings, eager to jump onto the bandwagon, I mull over research in IT as a plausible career option. This essay reflects why.
I’d position the following eight areas as the ‘cornerstones’ of IT in this era of the Internet: Data Storage, with Storage Area Networks the latest buzzword; Semiconductor Chips, with raw computing power growing by the day; Software, the latest battleground of the computer wars; Parallel/Distributed Architectures, systems and networks coming together for the sake of speed; Natural Language Processing, talking with IT; Fiber Optics/Wireless, for efficient data communication; Biotechnology, data-crunching to sort DNA strings or test new drugs; and IT-Enabled Services, change through e-commerce and e-governance. The scope for innovation in each of these areas is astronomical. It’s heartening to note that cutting-edge research is happening on a global scale, and happening for sure!
The sudden dawn of the Personal Computer created an industry-standard blueprint for hardware and a compatible operating system. After the PC created waves, the IT industry now embarks on the next computing revolution: a world in which intelligent devices will connect people, businesses, and information. I seize this opportunity to delve into my area of interest, ‘Artificial Intelligence’. Of late, AI has triggered my curiosity and made me sit in amazement and awe.
To put things in perspective, I present a case study set in a business environment. While an algorithm-based search engine for a company catalog would demand from the customer the complete set of specifications as search strings (an expectation too imposing), a ‘case-based reasoning’ system intelligently arrives at a decision when provided with only a partial set. AI is ‘the behavior by a machine that, if performed by a human being, would be called intelligent.’ This apparent thinking power of a computer is exploited to the hilt in applications like expert systems, pattern recognizers, intelligent networks, and so on. Intelligent agents toil for their masters just as humans do. Still at a nascent stage, though, is research into an Automatic Programmer, which would write code on behalf of the developer. The prospect of a human-like ‘thinking’ machine can throw open unforeseen vistas to mankind in general and computing in particular. It fits the bill, just right, for the sort of career I’d love to pursue!
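The catalog search contrast above can be made concrete with a minimal sketch of case-based retrieval: instead of requiring every specification, the system scores each stored case only on the attributes the customer actually supplies and returns the closest match. This is an illustrative toy, not any particular product; the catalog entries and attribute names below are invented for the example.

```python
# Minimal sketch of case-based retrieval over a hypothetical product catalog.
# All product names and attribute values here are invented for illustration.

catalog = [
    {"name": "Laptop A", "ram_gb": 2, "screen_in": 14, "color": "black"},
    {"name": "Laptop B", "ram_gb": 4, "screen_in": 15, "color": "silver"},
    {"name": "Laptop C", "ram_gb": 4, "screen_in": 13, "color": "black"},
]

def best_match(partial_query, cases):
    """Return the case agreeing with the most attributes the customer
    actually specified; unspecified attributes are simply ignored."""
    def score(case):
        return sum(1 for key, value in partial_query.items()
                   if case.get(key) == value)
    return max(cases, key=score)

# The customer knows only two of the attributes:
print(best_match({"ram_gb": 4, "color": "black"}, catalog)["name"])
# → Laptop C
```

A keyword-exact search given this partial query would either fail or demand the missing fields; the case-based version degrades gracefully, which is the behavior the case study describes.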
Despite the grave warnings about how the computer world inevitably dehumanizes us, it has not done so to date. We are as human as ever. Let us seize the opportunities and live up to the challenges that IT provides, and not be dodos: delightful but extinct. It reminds me of a proverb: “Man will wait for a long time with his mouth wide open before a duck flies in.”
The duck is worth going out and getting.
~ Vikram Subramanya