COMPUTER FUNDAMENTALS
VIRUSES AND ANTI-VIRUS SOFTWARE
Viruses are malicious pieces of software, designed to spread and cause damage to programs and files on computers. Some viruses are merely annoying, displaying useless error messages. Other viruses can destroy your data and prevent your computer from working. Most of these viruses operate transparently, so you won't even know that they're causing damage. As computer systems become more complex, virus writers have been able to craft new viruses and cause problems on a global scale. In recent years, there have been reports of companies losing millions of dollars because employees were unable to do their jobs due to computer viruses. In these days of high-speed Internet connections and e-mail everywhere, everyone needs to know how to protect themselves from virus outbreaks.
Types of Viruses: Because viruses do not all act in the same way, let's define the different types:
 Virus: a program that replicates and infects another program by inserting or attaching itself; basically "piggybacking" on files already present on your computer.
 Trojan horse: a program that does not replicate or copy itself, but causes damage by tricking you into opening an infected file.
 Worm: a program that makes copies of itself specifically intended to be distributed to other computers it can reach, such as via e-mail or network connections.
How does a virus spread? Viruses spread through a variety of methods. The most common are:
 E-mail attachments
 Instant Messenger or Internet Relay Chat (IRC) file transfers and web links
 File downloads from hacked or untrustworthy web sites
 Using infected floppy disks, CD-ROMs, USB keychain drives, etc.
 Insecure computers being hacked and configured to send out viruses
What is an anti-virus program? An anti-virus program is designed to protect your computer from virus infections. Anti-virus products seek out viruses by comparing your files against a database of known viral threats, and will identify files that are suspicious or actually infected. The program then helps you determine how best to deal with these threats, including trying to repair an infected file or deleting it in a safe manner.
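To make the idea of signature-based scanning concrete, here is a minimal Python sketch. It assumes a toy definition database containing SHA-256 digests of known-bad files; real anti-virus engines use far richer signatures and heuristics, so this illustrates the comparison step only.

```python
import hashlib
from pathlib import Path

# Hypothetical "virus definition database": SHA-256 digests of files known
# to be malicious. This digest is an invented placeholder.
KNOWN_BAD_DIGESTS = {
    "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",
}

def scan_file(path: Path) -> bool:
    """Return True if the file's digest matches a known threat."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_DIGESTS

# Scan every regular file in the current directory.
for candidate in Path(".").glob("*"):
    if candidate.is_file() and scan_file(candidate):
        print(f"Suspicious file found: {candidate}")
```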

Every anti-virus product looks and feels different, so it is extremely important to become familiar with the operation of the program you choose to install on your own computer.
FIREWALLS
Firewalls help to protect your computer from outside attack, and make your computer much safer while you're connected to the Internet. With a properly configured firewall running, your computer will be virtually invisible to the forces trying to breach your computer's security. To explain why this is important, let's first talk about what a firewall does while it's running.
Definition: firewall – a program or device that blocks un-requested communications from the Internet, preventing your computer from responding to potentially malicious attempts to gain access.
In its most basic form, a firewall places an electronic barrier between you and the Internet. This barrier examines the communications traffic going between your computer and the network. A firewall works by examining the details of "packets" of Internet traffic. If a packet appears to be safe, the firewall will allow you to receive the information. If it is not safe, or if it is not something you requested, the firewall will block the packet, and you may receive a notification. There are two types of firewalls:
 Software firewall: a program that runs on your computer. A software firewall may be built in to your computer's operating system (for example, Windows XP with Service Pack 2).
 Hardware firewall: a piece of equipment that is designed to operate as a firewall, or includes firewall functions (such as many commonly used home routers).
SPYWARE AND ANTI-SPYWARE SOFTWARE
Spyware and adware are an increasingly common threat, and one that is often not well understood by computer users. Spyware and adware programs, once installed on your computer, are often much harder to remove than viruses. They will often cause your computer to become sluggish, making it almost impossible to complete everyday tasks. These programs can be both a nuisance and a serious threat to your privacy and your computer's security.
Definition: spyware – a program that hides itself and runs on your computer, collecting data about you. Spyware typically collects such information as your credit card numbers, the websites you've visited, or even your passwords. This data is then transmitted to a company or individual on the Internet.

Definition: adware – a program that hides itself and runs on your computer, showing pop-up advertisements at random times, even when you're not connected to the Internet.
These programs typically sound like they might be helpful in some way, such as browser toolbars, "electronic wallets," or other similar helpers. The names and descriptions are often intentionally misleading, to lull you into a false sense of safety.
E-mail Spam Filtering
A common complaint about e-mail service is the ever-increasing amount of spam. Spam e-mail is both a nuisance and a potential threat to your privacy, so it's smart to understand why you receive it and what you can do to get rid of it. By automating the removal of spam from your Inbox, you can spend more time reading your e-mail and less time cleaning up your account.
Definition: spam – unsolicited bulk e-mail messages, typically intended to scam Internet users out of money. Like bulk mail advertising through the postal service, these messages do not have a specific target audience, and can come from anywhere in the world.
Most spam messages are not directed at any one specific person. They are sent to thousands of e-mail addresses, in the hope that someone will decide to respond. A typical spam message has a meaningless subject line, often composed of random words. Because most spam messages are scams, it is usually best to ignore or delete them.
Phishing
Phishing attacks are an increasingly common way for criminals to violate your privacy and safety. These scams target inexperienced and careless computer users in an attempt to obtain private information. Phishers are interested in tricking you into revealing such information as:
 Your username and password information for certain websites
 Your full name, home address, telephone number, social security number
 Your credit card information
Definition: phishing (pronounced "fishing") – unsolicited e-mail messages warning you to update your account or security information at a particular website. These messages may have the appearance of legitimacy, but are in fact attempts to steal your private information.
Phishing scams rely upon deception and the faith of inexperienced Internet users. Phishing e-mails are specifically designed to look like legitimate notifications from various companies (including banks, online retailers, auction websites, online payment services, etc.), with the intent of fooling you into action. These e-mail messages almost always direct you to visit a look-alike fake website, where you are then coaxed into divulging your personal information.
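Spam filtering can be automated with simple rules. The following Python sketch scores a message with two toy heuristics: suspicious keywords, and the phishing trick of a link whose real destination differs from the address shown to the reader. The keyword list and scoring are invented for illustration; production filters use statistical (e.g. Bayesian) classifiers trained on large corpora.

```python
import re

# Invented keyword list for illustration only.
SPAM_KEYWORDS = {"winner", "lottery", "act now", "verify your account"}

def spam_score(subject: str, body: str) -> int:
    text = (subject + " " + body).lower()
    score = sum(1 for phrase in SPAM_KEYWORDS if phrase in text)
    # Phishing heuristic: the link's real destination (href) differs
    # from the address displayed to the reader.
    for href_domain, shown_domain in re.findall(
            r'href="https?://([^/"]+)[^"]*"[^>]*>\s*https?://([^/<\s]+)', body):
        if shown_domain != href_domain:
            score += 3
    return score

msg = 'Click <a href="http://evil.example">http://mybank.com</a> to verify your account'
print(spam_score("URGENT notice", msg))  # 4, well above a spam threshold of, say, 2
```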

Computer security techniques
Some of these techniques include:
 The principle of least privilege, where each part of the system has only the privileges that are needed for its function. That way, even if an attacker gains access to that part, they have only limited access to the whole system.
 Automated theorem proving to prove the correctness of crucial software subsystems.
 Code reviews and unit testing, which are approaches to make modules more secure where formal correctness proofs are not possible.
 Defense in depth, where the design is such that more than one subsystem needs to be violated to compromise the integrity of the system and the information it holds.
 Default secure settings, and design to "fail secure" rather than "fail insecure". Ideally, a secure system should require a deliberate, conscious, knowledgeable and free decision on the part of legitimate authorities in order to make it insecure.
 Audit trails tracking system activity, so that when a security breach occurs, the mechanism and extent of the breach can be determined. Storing audit trails remotely, where they can only be appended to, can keep intruders from covering their tracks.
 Full disclosure, to ensure that when bugs are found the "window of vulnerability" is kept as short as possible.
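As a minimal, Unix-flavoured sketch of the first technique (least privilege): a process that needs root for only one step, such as binding a privileged port, can permanently give up root rights afterwards, so a later compromise yields only limited access. The username below is an assumption for illustration.

```python
import os
import pwd  # Unix-only module for looking up user account details

def drop_privileges(username: str = "nobody") -> None:
    """After the one step that genuinely needs root (e.g. binding port 80),
    permanently give up root rights, illustrating least privilege."""
    if os.getuid() != 0:
        return                      # already running unprivileged
    user = pwd.getpwnam(username)
    os.setgroups([])                # drop supplementary groups
    os.setgid(user.pw_gid)          # drop group id first...
    os.setuid(user.pw_uid)          # ...then user id (order matters:
                                    # once uid is dropped, gid cannot be)
```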
MALWARE
Malware, short for malicious (or malevolent) software, is software used or programmed by attackers to disrupt computer operation, gather sensitive information, or gain access to private computer systems. It can appear in the form of code, scripts, active content, and other software. 'Malware' is a general term used to refer to a variety of forms of hostile or intrusive software. Malware includes computer viruses, ransomware, worms, Trojan horses, rootkits, keyloggers, dialers, spyware, adware, malicious BHOs, rogue security software and other malicious programs. The majority of active malware threats are usually worms or Trojans rather than viruses. In law, malware is sometimes known as a computer contaminant. Malware is different from defective software, which is legitimate software that contains harmful bugs that were not corrected before release.
However, some malware is disguised as genuine software, and may come from an official company website in the form of a useful or attractive program which has the harmful malware embedded in it, along with additional tracking software that gathers marketing statistics. Software such as anti-virus, anti-malware, and firewalls is relied upon by users at home and by small and large organizations around the globe to safeguard against malware attacks; such software helps in identifying and preventing the further spread of malware in the network.
Computer virus
A computer virus is a computer program that can replicate itself and spread from one computer to another. The term "virus" is also commonly, but erroneously, used to refer to other types of malware, including adware and spyware programs that do not have a reproductive ability.
Ransomware
Ransomware (also referred to in some cases as cryptoviruses, cryptotrojans, cryptoworms or scareware) comprises a class of malware which restricts access to the computer system that it infects, and demands a ransom paid to the creator of the malware in order for the restriction to be removed. Some forms of ransomware encrypt files on the system's hard drive, while others may simply lock the system and display messages intended to coax the user into paying.
Computer worm
A computer worm is a standalone malware program that replicates itself in order to spread to other computers. Often, it uses a computer network to spread, relying on security failures on the target computer to access it. Unlike a computer virus, it does not need to attach itself to an existing program. Worms almost always cause at least some harm to the network, even if only by consuming bandwidth, whereas viruses almost always corrupt or modify files on a targeted computer.
Trojan horse
A Trojan horse, or Trojan, is a non-self-replicating type of malware which appears to perform a desirable function but instead drops a malicious payload, often including a backdoor allowing unauthorized access to the target's computer. These backdoors tend to be invisible to average users. Trojans do not attempt to inject themselves into other files like a computer virus. Trojan horses may steal information, or harm their host computer systems. Trojans may use drive-by downloads or install via online games or internet-driven applications in order to reach target computers. The term is derived from the Trojan Horse story in Greek mythology, because Trojan horses employ a form of "social engineering," presenting themselves as harmless, useful gifts in order to persuade victims to install them on their computers.

Rootkit
A rootkit is a stealthy type of software, often malicious, designed to hide the existence of certain processes or programs from normal methods of detection and enable continued privileged access to a computer. The term rootkit is a concatenation of "root" (the traditional name of the privileged account on Unix operating systems) and the word "kit" (which refers to the software components that implement the tool). The term "rootkit" has negative connotations through its association with malware.
Rootkit installation can be automated, or an attacker can install it once they've obtained root or Administrator access. Obtaining this access is a result of a direct attack on a system (i.e., exploiting a known vulnerability, or obtaining a password by cracking, privilege escalation, or social engineering). Once installed, it becomes possible to hide the intrusion as well as to maintain privileged access. The key is the root/Administrator access. Full control over a system means that existing software can be modified, including software that might otherwise be used to detect or circumvent the rootkit.
Keystroke logging
Keystroke logging, often referred to as keylogging, is the action of recording (or logging) the keys struck on a keyboard, typically in a covert manner so that the person using the keyboard is unaware that their actions are being monitored. It also has very legitimate uses in studies of human-computer interaction. There are numerous keylogging methods, ranging from hardware- and software-based approaches to acoustic analysis.
Dialers
Dialers are necessary to connect to the internet (at least for non-broadband connections), but some dialers are designed to connect to premium-rate numbers. The providers of such dialers often search for security holes in the operating system installed on the user's computer and use them to set the computer up to dial through their number, so as to make money from the calls. Alternatively, some dialers inform the user what they are doing, with the promise of special content accessible only via the special number.
Spyware
Spyware is software that aids in gathering information about a person or organization without their knowledge, that may send such information to another entity without the consumer's consent, or that asserts control over a computer without the consumer's knowledge.

"Spyware" is mostly classified into four types: system monitors, trojans, adware, and tracking cookies. Spyware is mostly used for purposes such as tracking and storing internet users' movements on the web and serving up pop-up ads to internet users. Whenever spyware is used for malicious purposes, its presence is typically hidden from the user and can be difficult to detect. Some spyware, such as keyloggers, may be installed intentionally by the owner of a shared, corporate, or public computer in order to monitor users.
While the term spyware suggests software that monitors a user's computing, the functions of spyware can extend beyond simple monitoring. Spyware can collect almost any type of data, including personal information like Internet surfing habits, user logins, and bank or credit account information. Spyware can also interfere with user control of a computer by installing additional software or redirecting Web browsers. Some spyware can change computer settings, which can result in slow Internet connection speeds, unauthorized changes in browser settings, or changes to software settings. Sometimes, spyware is included along with genuine software, and may come from a malicious website. In response to the emergence of spyware, a small industry has sprung up dealing in anti-spyware software. Running anti-spyware software has become a widely recognized element of computer security practices, especially for computers running Microsoft Windows.
Browser Helper Object (BHO)
A Browser Helper Object (BHO) is a DLL module designed as a plugin for Microsoft's Internet Explorer web browser to provide added functionality. The BHO API exposes hooks that allow the BHO to access the Document Object Model (DOM) of the current page and to control navigation. Because BHOs have unrestricted access to the Internet Explorer event model, some forms of malware have also been created as BHOs.
Rogue security software
Rogue security software is a form of Internet fraud using computer malware that deceives or misleads users into paying money for fake or simulated removal of malware (and is thus a form of ransomware), or it claims to get rid of malware but instead introduces malware to the computer.
CRYPTOGRAPHY
Cryptography is the practice and study of techniques for secure communication in the presence of third parties (called adversaries). More generally, it is about constructing and analyzing protocols that overcome the influence of adversaries and which are related to various aspects of information security such as data confidentiality, data integrity, authentication, and non-repudiation.

Modern cryptography intersects the disciplines of mathematics, computer science, and electrical engineering. Applications of cryptography include ATM cards, computer passwords, and electronic commerce.
Cryptography is the science of using mathematics to encrypt and decrypt data. Cryptography enables you to store sensitive information or transmit it across insecure networks (like the Internet) so that it cannot be read by anyone except the intended recipient. While cryptography is the science of securing data, cryptanalysis is the science of analyzing and breaking secure communication. Classical cryptanalysis involves an interesting combination of analytical reasoning, application of mathematical tools, pattern finding, patience, determination, and luck. Cryptanalysts are also called attackers. Cryptology embraces both cryptography and cryptanalysis.
A cryptographic algorithm, or cipher, is a mathematical function used in the encryption and decryption process. A cryptographic algorithm works in combination with a key (a word, number, or phrase) to encrypt the plaintext. The same plaintext encrypts to different ciphertext with different keys. The security of encrypted data is entirely dependent on two things: the strength of the cryptographic algorithm and the secrecy of the key. A cryptographic algorithm, plus all possible keys and all the protocols that make it work, comprises a cryptosystem.
In conventional cryptography, also called secret-key or symmetric-key encryption, one key is used both for encryption and decryption. The Data Encryption Standard (DES) is an example of a conventional cryptosystem that is widely employed by the Federal Government. Figure 1-2 is an illustration of the conventional encryption process.
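As a toy illustration of how an algorithm and a key combine, the following Python sketch implements a repeating-key XOR cipher. It is deliberately insecure (nothing like DES) and is meant only to show that in symmetric-key encryption one shared key drives both encryption and decryption, and that different keys yield different ciphertext.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key. The same function
    both encrypts and decrypts. Illustration only: a repeating-key XOR
    is trivially breakable and must never be used for real secrets."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"transfer $100 to account 42"
ciphertext = xor_cipher(plaintext, b"secret-key")
print(ciphertext)                              # unreadable bytes
print(xor_cipher(ciphertext, b"secret-key"))   # decrypts back to the plaintext
print(xor_cipher(plaintext, b"another-key"))   # different key, different ciphertext
```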

DIGITAL SIGNATURE
A digital signature or digital signature scheme is a mathematical scheme for demonstrating the authenticity of a digital message or document.

A valid digital signature gives a recipient reason to believe that the message was created by a known sender, such that the sender cannot deny having sent the message (authentication and non-repudiation) and that the message was not altered in transit (integrity). Digital signatures are commonly used for software distribution, financial transactions, and in other cases where it is important to detect forgery or tampering.
Digital signatures employ a type of asymmetric cryptography. For messages sent through a non-secure channel, a properly implemented digital signature gives the receiver reason to believe the message was sent by the claimed sender. Digital signatures are equivalent to traditional handwritten signatures in many respects, but properly implemented digital signatures are more difficult to forge than the handwritten type. Digital signature schemes in the sense used here are cryptographically based, and must be implemented properly to be effective. Digital signatures can also provide non-repudiation, meaning that the signer cannot successfully claim they did not sign a message while also claiming their private key remains secret; further, some non-repudiation schemes offer a time stamp for the digital signature, so that even if the private key is exposed, the signature remains valid. Digitally signed messages may be anything representable as a bitstring: examples include electronic mail, contracts, or a message sent via some other cryptographic protocol.
A digital signature scheme typically consists of three algorithms:
 A key generation algorithm that selects a private key uniformly at random from a set of possible private keys. The algorithm outputs the private key and a corresponding public key.
 A signing algorithm that, given a message and a private key, produces a signature.
 A signature verifying algorithm that, given a message, public key and signature, either accepts or rejects the message's claim to authenticity.
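These three algorithms map directly onto the API of modern signature libraries. As a sketch, assuming the third-party Python cryptography package is installed (pip install cryptography), an Ed25519 key pair can be generated, used to sign, and verified like this:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# 1. Key generation: a private key and its corresponding public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# 2. Signing: message + private key -> signature.
message = b"Pay Alice $100"
signature = private_key.sign(message)

# 3. Verification: message + public key + signature -> accept or reject.
try:
    public_key.verify(signature, message)
    print("Signature valid: message is authentic and unaltered.")
except InvalidSignature:
    print("Signature invalid!")

# Any tampering in transit makes verification fail:
try:
    public_key.verify(signature, b"Pay Mallory $100")
except InvalidSignature:
    print("Tampered message rejected.")
```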
FIREWALL
A firewall can be either software-based or hardware-based and is used to help keep a network secure. Its primary objective is to control the incoming and outgoing network traffic by analyzing the data packets and determining whether they should be allowed through or not, based on a predetermined rule set. A network's firewall builds a bridge between an internal network or computer that is assumed to be secure and trusted, and another network, usually an external (inter)network such as the Internet, that is not assumed to be secure and trusted.
Many personal computer operating systems include software-based firewalls to protect against threats from the public Internet. Many routers that pass data between networks contain firewall components and, conversely, many firewalls can perform basic routing functions.

Traditionally, a firewall is defined as any device (or software) used to filter or control the flow of traffic. Firewalls are typically implemented on the network perimeter, and function by defining trusted and untrusted zones. Most firewalls will permit traffic from the trusted zone to the untrusted zone without any explicit configuration. However, traffic from the untrusted zone to the trusted zone must be explicitly permitted. Thus, any traffic that is not explicitly permitted from the untrusted to the trusted zone will be implicitly denied (by default on most firewall systems).
A firewall is not limited to only two zones, but can contain multiple 'less trusted' zones, often referred to as Demilitarized Zones (DMZs). To control the trust value of each zone, each firewall interface is assigned a security level, which is often represented as a numerical value or even a color. For example, in a typical three-zone design, the Trusted Zone could be assigned a security value of 100, the Less Trusted Zone a value of 75, and the Untrusted Zone a value of 0. As stated previously, traffic from a higher-security to a lower-security zone is (generally) allowed by default, while traffic from a lower-security to a higher-security zone requires explicit permission.
Firewall Services
Firewalls perform the following services:
 Packet Filtering
 Stateful Packet Inspection
 Proxying
 Network Address Translation (NAT)
Firewalls primarily provide access control for connections between networks.
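A minimal sketch of the security-level comparison just described, using the example values of 100, 75 and 0 (the zone names and table are illustrative, not a real firewall configuration):

```python
# Hypothetical zone table: higher number = more trusted interface.
SECURITY_LEVELS = {"trusted": 100, "dmz": 75, "untrusted": 0}

def allowed_by_default(src_zone: str, dst_zone: str) -> bool:
    """Traffic from a higher-security zone to a lower-security zone is
    permitted by default; the reverse direction needs an explicit rule."""
    return SECURITY_LEVELS[src_zone] > SECURITY_LEVELS[dst_zone]

print(allowed_by_default("trusted", "untrusted"))   # True  (LAN -> Internet)
print(allowed_by_default("untrusted", "trusted"))   # False (Internet -> LAN: implicit deny)
print(allowed_by_default("untrusted", "dmz"))       # False (needs an explicit permit rule)
```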


1. Trusted: this is usually the corporate LAN. It is assumed that all PCs and servers in the LAN are under your administrative control. If users are able to change their IP address and install software at will, then this assumption no longer holds.
2. Untrusted: the public Internet, reached via the firewall's WAN interface.
3. Partially trusted: the firewall's DMZ interface. These are machines under our control, but freely accessible from the Internet. They are not fully trusted because it is assumed that, being accessible, they will be compromised or hacked at some time.

USER IDENTIFICATION AND AUTHENTICATION
User identification is the process of identifying an end-user who is browsing via the Vital Security system, for the following reasons:
 Authentication – applying the correct policy to the end-user (for example, Security, Logging, and HTTPS policies)
 Authorization – deciding whether a user is authorized to browse via the system
 Auditing – tracing end-user activity through logs, that is, recording (logging) transactions with details for future viewing and analysis of activities performed by the user
The following section summarizes the general differences between identification and authentication:
Identification – The end-user's browser initiates a Web session through Vital Security. Vital Security, at this point, can identify the user based on the source IP address or according to credentials, by challenging the user to send the credentials using NTLM or via clear text (basic authentication). If the second option is configured, an authentication handshake is performed between the end-user's browser and the Vital Security system, based on one of the preselected authentication types (basic or NTLM). If the user is already authenticated in the network, the end-user's browser will automatically send the required credentials to the Vital Security system (in environments other than Windows, a pop-up window appears asking the user for credentials). Otherwise, a dialog window will pop up, requesting user credentials (username and password). Vital Security then tries to locate the username in its pre-loaded users list (imported from an AD/LDAP Directory). If the username is found, the policy that is assigned to the user will be enforced on that session (either a dedicated policy or a group policy). If the username is not found, the user is assumed to be an unknown user and is given the security policy that is assigned to the Unknown Users group.
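The policy-selection step just described reduces to a lookup with a fallback. The sketch below is purely illustrative (the names and structure are invented and do not reflect the actual Vital Security implementation):

```python
# Hypothetical policy table, as if imported from an AD/LDAP directory.
USER_POLICIES = {"alice": "finance-policy", "bob": "default-policy"}
UNKNOWN_USERS_POLICY = "unknown-users-policy"

def policy_for(username: str) -> str:
    """Return the user's dedicated/group policy if the username was found
    in the pre-loaded list, otherwise the Unknown Users group policy."""
    return USER_POLICIES.get(username, UNKNOWN_USERS_POLICY)

print(policy_for("alice"))   # finance-policy
print(policy_for("mallory")) # unknown-users-policy
```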

Authentication – Authentication is used when the user/domain information obtained from the end-user is validated via an external authentication server (for example, Active Directory). When real user authentication is required, the authentication is performed through the use of an authentication service running on one of the Finjan devices (Scanning Server or Gateway Device), which communicates with an authentication server to validate that the username/password supplied by the end-user's browser actually belongs to the supplied domain. Using the Gateway Device to perform user authentication has several main benefits:
 Performance – The Scanning Server does not need to handle the entire authentication (which requires several HTTP transactions); therefore, the overall performance of the system is higher.
 Security – The Gateway Device is installed in the LAN, and all users' credentials remain in the LAN.
 High Availability and Scalability – The Gateway Device can be installed in a cluster (using a third-party load balancer) for high availability and scalability when better performance is required.
SECURITY AWARENESS
Information security is the protection of information against failure, disclosure and manipulation. It is commonly accepted that the majority of security violations are due to human interaction rather than technology failure. Yet companies depend on technology, and give it a great deal of consideration, while usually forgetting the participation of human beings in the system. Organizations often use the very best products and technology for the protection of information and infrastructure, but ignore humans' contribution and role in securing organizational assets. Companies make the mistake of equating information security with products and technology, although it is a process that needs human interaction and involvement. There is no such thing as 100% security, but we try to maximize its level through an awareness program and human involvement in the process.
A simple definition of the three security pillars follows. If any one of them is missing, that is a flaw and goes against information security measures.
Confidentiality: It means only authorized people can see information, e.g. you are the only one authorized to see your bank statement.

Integrity: It ensures that information has not been changed either in transit or while in storage. It means only authorized people can change the information, e.g. you can see your bank statement but are not authorized to change it according to your wishes.
Availability: It means information is available when and where it is needed, e.g. you can get money from an ATM when you want to buy things.
Information security awareness is user education and awareness aimed at handling information security threats and minimizing their impact. An awareness program focuses attention on information security issues like confidentiality, integrity and availability. It highlights the importance of these factors and their role in business, and concentrates on how to handle them in a confident way. Information security awareness is a method used to educate people in the organization. It highlights the importance of information, threats to that information, and staff's contribution in implementing policies and procedures for the protection of information. An awareness program is an attempt to change the behavior of employees towards systems and processes in the organization. It teaches what needs to be protected, against whom, and how.
In today's business environment most companies rely on electronically exchanged information. All departments are required to produce and pass information across different departments in a quick and secure manner to support their business decisions. Information plays an important role in making decisions. Therefore commercial companies, and even government departments, have different classifications of data based on its importance and use. Business success depends upon continuity of operations and the information provided to business processes by information systems. The growth, excellence and efficiency of the business could be damaged by threats to and misuse of information. Therefore, an awareness program helps set measures and educates users on how to behave and benefit from information without jeopardizing its confidentiality, integrity and availability.
Employees are the primary users of information. A lack of awareness and mishandling of information could expose this information to competitors or allow it to become corrupted, with serious impacts on the company and its business functions. A company should therefore run an awareness program on a continuous basis and be conscious in protecting its information assets.


Implementing a strong information security awareness program can be a very effective method to protect critical business secrets, and it will help employees to understand:
 Why they need to take information security seriously
 What they gain from active participation and support
 How a secure environment helps them complete their assigned tasks
Here are some of the methods to convey the security awareness message across the organization:
 Information security awareness training
 Computer-based information security awareness
 Awareness services and reminder tools
COMPUTER SECURITY POLICIES
A computer security policy defines the goals and elements of an organization's computer systems. The definition can be highly formal or informal. Security policies are enforced by organizational policies or security mechanisms. A technical implementation defines whether a computer system is secure or insecure. Formal policy models can be categorized by the core security principles of confidentiality, integrity and availability. For example, the Bell-LaPadula model is a confidentiality policy model, whereas the Biba model is an integrity policy model (a code sketch of the Bell-LaPadula rules appears below).
Why you need a security policy
The essence of a security policy is to establish standards and guidelines for accessing your corporate information and application programs. Typically, organizations start with informal and undocumented security policies and procedures; but as your enterprise grows and your workforce becomes more mobile and diverse, it becomes especially important, even necessary, for your security policies to be documented in writing. Doing so will help avoid misunderstandings and will ensure that employees and contractors know how to behave. It will provide explicit guidelines for your security staff, making it easier for them to enforce security policies consistently. Furthermore, a security policy facilitates internal discussion about security, and helps everyone become more aware of potential security threats and associated business risks. You'll find that a written policy generally tends to enhance the performance of your security systems, and the e-business applications they support. It provides the guidelines necessary for determining the proper configuration of systems and a bar against which you can measure the effectiveness of your security efforts.
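To make the Bell-LaPadula model mentioned above concrete, here is a minimal Python rendering of its two rules, "no read up" and "no write down". The clearance levels are illustrative; real implementations also track categories and compartments.

```python
# Illustrative clearance levels (higher number = more sensitive).
LEVELS = {"public": 0, "confidential": 1, "secret": 2, "top-secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    # Simple Security Property: a subject may read only at or below
    # its own level ("no read up").
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    # *-Property: a subject may write only at or above its own level
    # ("no write down"), preventing leaks of high data into low objects.
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("secret", "public"))        # True  (read down is allowed)
print(can_read("confidential", "secret"))  # False (no read up)
print(can_write("secret", "public"))       # False (no write down)
```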


A security policy is a formal statement of the rules that employees and others must follow when using your company's computer systems and networks. Its purpose is to make everyone aware of their responsibilities for protecting your organization's information assets, and it should specify the details of how to do so. Developing the security policy involves assessing your information assets, security threats, and business risks; putting appropriate policies into written form; implementing those policies; and then repeating that cycle, updating your written security policy on a regular basis and ensuring that everyone in your organization remains mindful of their information security obligations.
TRENDS IN INFORMATION TECHNOLOGY
The 21st century has been defined by the application of and advancement in information technology. Information technology has become an integral part of our daily life. According to the Information Technology Association of America, information technology is defined as "the study, design, development, application, implementation, support or management of computer-based information systems." Information technology has served as a big change agent in different aspects of business and society. It has proven to be a game changer in resolving economic and social issues. Advancements and applications of information technology are ever changing. Some of the trends in information technology are as follows:
Cloud Computing
One of the most talked-about concepts in information technology is cloud computing. Cloud computing is defined as the utilization of computing services, i.e. software as well as hardware, as a service over a network. Typically, this network is the internet. Cloud computing offers three broad types of service, namely Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). Some of the benefits of cloud computing are as follows:
 Cloud computing reduces the IT infrastructure cost of the company.
 Cloud computing promotes the concept of virtualization, which enables servers and storage devices to be utilized across the organization.
 Cloud computing makes maintenance of software and hardware easier, as installation is not required on each end user's computer.


Some issues concerning cloud computing are privacy, compliance, security, legal questions, abuse, and IT governance.
Mobile Application
Another emerging trend within information technology is mobile applications (software applications on smartphones, tablets, etc.). Mobile applications, or mobile apps, have been a success since their introduction. They are designed to run on smartphones, tablets and other mobile devices. They are available as downloads from various mobile operating system vendors like Apple, BlackBerry, Nokia, etc. Some mobile apps are available free of charge, whereas some involve a download cost. The revenue collected is shared between the app distributor and the app developer.
User Interfaces
The user interface has undergone a revolution since the introduction of the touch screen. Touch screen capability has revolutionized the way end users interact with applications. A touch screen enables the user to directly interact with what is displayed, and also removes any intermediate hand-held device like the mouse. Touch screen capability is utilized in smartphones, tablets, information kiosks and other information appliances.
Analytics
The field of analytics has grown manyfold in recent years. Analytics is a process which helps in discovering informational patterns within data. The field of analytics is a combination of statistics, computer programming and operations research. It has shown growth in the areas of data analytics, predictive analytics and social analytics. Data analytics is a tool used to support the decision-making process; it converts raw data into meaningful information. Predictive analytics is a tool used to predict future events based on current and historical information. Social media analytics is a tool used by companies to understand and accommodate customer needs.
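As a tiny, self-contained illustration of predictive analytics, the following Python sketch fits a least-squares line to invented historical values and extrapolates one period ahead:

```python
# Fit a linear trend to past monthly values (invented numbers) and
# predict the next period: the core idea behind simple predictive analytics.
sales = [110, 118, 131, 139, 152, 160]   # historical observations
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(sales) / n

# Ordinary least squares for a single predictor (the time index).
num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales))
den = sum((x - x_mean) ** 2 for x in xs)
slope = num / den
intercept = y_mean - slope * x_mean

forecast = slope * n + intercept          # predict period n+1
print(f"Projected next value: {forecast:.1f}")   # ~171 for this toy data
```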
CURRENT TRENDS IN INFORMATION TECHNOLOGY
E-COMMERCE: Electronic commerce, commonly known as e-commerce, is a type of industry where the buying and selling of products or services is conducted over electronic systems such as the Internet and other computer networks. Electronic commerce draws on technologies such as mobile commerce, electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection systems. Modern electronic commerce typically uses the World Wide Web at least at one point in the transaction's life cycle, although it may encompass a wider range of technologies such as e-mail, mobile devices, social media, and telephones as well.
Electronic commerce is generally considered to be the sales aspect of e-business. It also consists of the exchange of data to facilitate the financing and payment aspects of business transactions. E-commerce can be divided into:
 E-tailing or "virtual storefronts" on websites with online catalogs, sometimes gathered into a "virtual mall"
 The gathering and use of demographic data through Web contacts and social media
 Electronic Data Interchange (EDI), the business-to-business exchange of data
 E-mail and fax and their use as media for reaching prospective and established customers (for example, with newsletters)
 Business-to-business buying and selling
 The security of business transactions
INFORMATION SYSTEM SECURITY: Information security (sometimes shortened to InfoSec) is the practice of defending information from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction. It is a general term that can be used regardless of the form the data may take (electronic, physical, etc.). The field of information security has grown and evolved significantly in recent years. There are many ways of gaining entry into the field as a career. It offers many areas for specialization, including securing networks and allied infrastructure, securing applications and databases, security testing, information systems auditing, business continuity planning, digital forensics, etc.
FIREWALLS: The traditional firewall filters traffic based upon ports and protocols, for example, blocking or allowing the entire port 80 for HTTP traffic or port 443 for HTTPS traffic. It's an "all-or-nothing" approach. Newer firewall technology can also filter traffic based upon the applications or traffic types traversing these ports. For example, you could open port 80 for only select HTTP traffic, for those specific applications, sites, or services you allow. Think of it as blending the firewall and quality of service (QoS) functionalities into one solution. These application-aware firewalls are commonly cited as next-generation firewalls (NGFW), but they are, basically, a form of unified threat management (UTM) solution.

However, the term UTM is usually applied to products that lack true application-awareness and are targeted towards the SMB market. UTM products usually offer additional functions over traditional firewalls, such as antivirus, antispam, or even intrusion prevention systems (IPS).
KNOWLEDGE MANAGEMENT: Knowledge management (KM) comprises a range of strategies and practices used in an organization to identify, create, represent, distribute, and enable adoption of insights and experiences. Such insights and experiences comprise knowledge, either embodied in individuals or embedded in organizations as processes or practices.
Early KM technologies included online corporate yellow pages as expertise locators and document management systems. Combined with the early development of collaborative technologies (in particular Lotus Notes), KM technologies expanded in the mid-1990s. Knowledge management systems can thus be categorized as falling into one or more of the following groups: groupware, document management systems, expert systems, semantic networks, relational and object-oriented databases, simulation tools, and artificial intelligence. More recently, the development of social computing tools (such as bookmarks, blogs, and wikis) has allowed more unstructured, self-governing or ecosystem approaches to the transfer, capture and creation of knowledge, including the development of new forms of communities, networks, or matrixed organizations.
Software tools in knowledge management are a collection of technologies and are not necessarily acquired as a single software solution. Furthermore, these knowledge management software tools have the advantage of using the organization's existing information technology infrastructure. Organizations and business decision makers spend a great deal of resources and make significant investments in the latest technology, systems and infrastructure to support knowledge management. It is imperative that these investments are validated properly, made wisely, and that the most appropriate technologies and software tools are selected or combined to facilitate knowledge management. Knowledge management has also become a cornerstone in emerging business strategies such as Service Lifecycle Management (SLM), with companies increasingly turning to software vendors to enhance their efficiency in industries including, but not limited to, the aviation industry.
Seven trends are influencing how organisations are grappling with knowledge management:
1. Convergence – According to author Heather Creech, knowledge concepts and practices for international organisations have emerged out of a "cross fertilization" of management approaches in the private sector, innovation in the uses of information and communication technologies (ICTs), and processes for addressing international development through consultative approaches.
2. Transition from the storage and retrieval of information to active engagement with the knowledge user – "KM has moved well beyond the systematic collection, archiving and retrieval of information. Merged into KM are concepts of dialogue, relationship-building and adaptive learning through constant interaction with users, who have their own knowledge and perspectives to contribute."
3. Shifting emphasis from knowledge to influence – "Knowledge management practice now includes the creation of internal communities to foster face-to-face and e-mail interaction among staff."
4. A new focus on social capital and social networks – "Organizations are now looking at the tools and training for staff to map their existing social networks and to understand how to build 'social capital' with their colleagues, clients and audiences. Social network analysis is the mapping and measuring of how knowledge flows through these relationships."
5. Open source/open content: addressing the democratisation of knowledge-sharing – The open source concept (which involves the release among computer programmers of source code for others to work with and adapt freely) is of growing interest to knowledge-based organisations. Creech indicates that this practice has evolved into "open content", which he describes as "an ideology of collaboration that grants broader rights for sharing and using new ideas and practices."
6. The adoption of different modalities – "There is growing interest in how collaboration among groups of people can be 'governed' rather than 'managed.' An organization creates the space for knowledge-sharing through providing leadership and resources, and through clear articulation of roles and expectations, and then lets the emerging community run itself."
7. Adaptive management – "There is an increasing recognition that for learning to be transformational, there have to be mechanisms for monitoring work, relationships and knowledge exchanges as they progress.... There is a growing trend within organizations towards more informal 'lessons learned' cycles, where knowledge gained is more rapidly and easily shared, and work adjusted accordingly. In general, there is growing acknowledgement that organizational cultures of adaptation need to be developed in order to respond more readily to changing circumstances."
GIS: Current Trends in GIS
A Geographic Information System (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographical data. The acronym GIS is sometimes used for geographical information science or geospatial information studies, referring to the academic discipline or career of working with geographic information systems; this is a large domain within the broader academic discipline of geoinformatics. In the simplest terms, GIS is the merging of cartography, statistical analysis, and computer science technology.
Many disciplines can benefit from GIS technology. An active GIS market has resulted in lower costs and continual improvements in the hardware and software components of GIS. These developments will, in turn, result in a much wider use of the technology throughout science, government, business, and industry, with applications including real estate, public health, crime mapping, national defense, sustainable development, natural resources, landscape architecture, archaeology, regional and community planning, transportation and logistics.
GIS is also diverging into location-based services, which allow GPS-enabled mobile devices to display their location in relation to fixed assets (nearest restaurant, gas station, fire hydrant), mobile assets (friends, children, police cars) or to relay their position back to a central server for display or other processing. These services continue to develop with the increased integration of GPS functionality with increasingly powerful mobile electronics (cell phones, PDAs, laptops). The following are the major trends affecting GIS, as described by ESRI:
Platforms. Computing platforms continue to proliferate and diversify, and ESRI is working to provide solutions across all of these platforms. "Pervasive mapping," which is enabled by this proliferation of platforms, gives anyone the ability to use GIS, from anywhere. The goal is a single, unified experience regardless of the platform you use.
Crowdsourcing. Crowdsourced data, initially met with skepticism and concern by the geospatial community, is now going mainstream. GIS practitioners have long been the keepers of "authoritative" data, and are now beginning to take crowdsourced data very seriously. This is in part due to the tremendous utility of crowdsourced data seen during recent disasters. Crowdsourced data enriches GIS, and ESRI is looking at ways its users can use, manage, interpret, and incorporate it into their work. You may be interested in reading a paper titled "Crowdsourcing Critical Success Factor Model."
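As an illustration of the location-based services described above, the following self-contained Python sketch finds the nearest fixed asset to a device's GPS fix using the haversine great-circle distance (coordinates and asset names are invented):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))      # Earth's mean radius is about 6371 km

# Invented fixed assets: (name, latitude, longitude).
assets = [("restaurant", 40.7580, -73.9855),
          ("gas station", 40.7484, -73.9857),
          ("fire hydrant", 40.7527, -73.9772)]

here = (40.7500, -73.9800)               # the device's GPS fix
nearest = min(assets, key=lambda a: haversine_km(*here, a[1], a[2]))
print("Nearest asset:", nearest[0])
```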

The Cloud. Cloud computing is rapidly emerging as a technology trend that almost every industry that provides or consumes software, hardware, and infrastructure can leverage. The technology and architecture that cloud service and deployment models offer are key areas of research and development. GIS in the cloud provides opportunities for organizations to become more cost-effective, productive, and flexible, and enables them to rapidly deliver new capabilities.
GeoDesign. Maps are a way to abstract place to make it easier to understand. With GeoDesign, we move beyond simply understanding place, to designing it. This more active, engaged, and proactive approach to designing with geography is being enabled by an evolving set of new tools in GIS. Along with web collaboration, scenario management, design iteration, and data sharing tools, ArcGIS continues to evolve as a complete platform for GeoDesign.
BCP / BPO: Business continuity planning (BCP) "identifies an organization's exposure to internal and external threats and synthesizes hard and soft assets to provide effective prevention and recovery for the organization, while maintaining competitive advantage and value system integrity."
Business process outsourcing (BPO) is a subset of outsourcing that involves the contracting of the operations and responsibilities of specific business functions (or processes) to a third-party service provider. Originally, this was associated with manufacturing firms, such as Coca-Cola, which outsourced large segments of its supply chain. BPO is typically categorized into back office outsourcing, which includes internal business functions such as human resources or finance and accounting, and front office outsourcing, which includes customer-related services such as contact centre services. BPO that is contracted outside a company's country is called offshore outsourcing. BPO that is contracted to a company's neighboring (or nearby) country is called nearshore outsourcing.
Variations in current technologies and increased domain expertise across verticals are sure to help BPO race ahead of other sectors in 2013. Technology- and platform-based BPO are going to be among the biggest trendsetters of the year 2013. Adoption of cloud and mobile collaborations, the requirement of multiple locations and a better hold on autonomous technology structures are here to stay throughout 2013. Cloud-based BPO would mean a lot of hard work on the client's as well as the provider's side in transferring huge amounts of data from the client's infrastructure to a third-party (BPO) cloud-enabled platform. The line of distinction between onshore, near-shore and offshore work will diminish when the business becomes more final-outcome driven.

Flourishing verticals including retail, shipping, shopping and healthcare, to name a few, are likely to bring business to India in 2013. This can be attributed to the country's increasing potential in analytics, finance and R&D.
Make analytics your forte: this should be the work mantra, as economic analysis has achieved more prominence than ever before in today's growing economy. Analytics-driven experiments are far less prone to failure. Observing a trend manually is a difficult task, and technology today does what humans find difficult to do. Noticing a trend early in the life cycle of a business process is always an added advantage, as decisions taken by C-level executives are based on the trends spotted.
Vertical BPO is the key to the future of BPO. Having domain expertise is a way out of ignorance and unawareness. Vertical BPO takes you a step closer to meeting your client's requirements.
Service providers to be replaced by partners: Clients look for smart partners when resolving problems during big data analysis, facing stiff competition and, last but not least, facing wise customers. A smart and enterprising partner will always help with innovation and speed.
Rise of small-time business firms: The number of organizations that specialize in offering domain-specific expertise to a particular vertical will increase in 2013. It will encourage trust building among clients. The line of distinction between IT and BPO, if it remains intact, will bring growth to the sector.
Contact centers will be kept open 24x7 all year round: This is because customers usually engage with their banks on social media on holidays, while financial sector firms remain closed on Sundays. Here, a customer relationship manager needs to understand that s/he must never fall back on "we are closed on Sundays". They should keep themselves updated on social network sites every hour of the day, 365 days a year. This way the BPO sector enables contact centers to offer 24x7 year-round services via Facebook and Twitter.
Big Data Analysis: It is a question that bothers both clients and their service providers: how to manage the ever-growing and unstructured data collected from business processes? The deep-rooted concern is that 23 percent of the information in the digital world is useful; however, only 3 percent is utilized for data analysis. The other challenge is to match the strength of the employees with the increasing volume of Big Data.
Domestic outsourcing to rise to new horizons: Considering the recent rise in wage demands by Indian and Chinese employees, getting your business done within the country comes as a lucrative option. This way the business will benefit from being located geographically near the organization.

E-BANKING: Online banking has the potential to make people's lives easier in many ways, offering the convenience of simple financial transactions through secure websites that provide a hassle-free experience. These websites are operated by brick-and-mortar banks, credit unions, and virtual banks, along with brokerage houses. Although online banking stands out as the easiest way of handling transactions, people are equally concerned about hackers accessing their personal information via the Internet. Hence, banks have gone to great lengths to provide fool-proof security for their customers by keeping up with the latest technological trends:
1. Convergence of Mobile and Online Technologies
Mobile banking started as a novelty, something only techies and early adopters felt comfortable using. But as smartphones have skyrocketed in popularity over the past few years, mobile banking adoption has increased along with it.
2. The Rise of Business Process Management
Both to increase efficiency and to ensure regulatory compliance, banks need better methods of gathering and reporting data. Most banks struggle with multiple back-office systems and siloed information. To address these issues in earnest, there will be a large investment in new and improved business process management tools in the year ahead.
3. Goodbye Email, Hello Message Center
The abandonment of email for anything sensitive already has begun, and the shift to total reliance on message centers (dedicated web portals designed for secure communication between a bank and its customers) looks to be here to stay.
4. The 'Tabletization' of Banking and the User Experience
Tablet banking is still a young channel, but it is rife with potential. As with initial mobile forays, it may take banks some period of trial and error to determine how to build the best banking experience for the tablet environment. But most experts agree that the potential for a great tablet banking user experience, especially with the rich interface tablets offer, is nearly unlimited.
5. Security Is Increasingly a Moving Target
Mobile devices are becoming nearly omnipresent. Many industry experts and talking heads already have proclaimed the "death of the PC" as consumers increasingly spend their time on their smartphones and tablets, including for their banking needs. But as more people conduct their banking on mobile devices, these devices also will become the growing focus of hackers and fraudsters, who are always on the hunt for ripe targets.

6. Integrating Toward a Brave, New Post-Channel World
The days when a customer would walk into a branch to fulfill all of his or her banking needs are long gone. If a customer starts a loan application online and doesn't have time to complete it, that customer then expects to be able to come into a branch on the way home from work to finish it. However, in too many cases the same customer will be asked to start the process all over again in the branch, due to a lack of channel integration. Banks will need to better manage the seamless integration of online, offline and mobile channels in 2012 and beyond.
E-SPEAK: E-Speak was a middleware technology created by Hewlett-Packard in 1999. The platform is considered to be one of the first pre-SOAP web services platforms and was nominated for a Smithsonian award. The goal of this project was to provide the infrastructure for global service interaction over the Internet.
E-LOGISTICS: Electronic logistics refers to the process which utilizes web technology as an important tool to manage the whole logistics process, or some sectors of it. E-logistics realizes the utility of electronic technology and the integration of logistics organization, trade, management and service modes. It enables the sharing of data, knowledge and other information with partners in the supply chain. Seen from these definitions, as e-logistics has been combined with the meaning of commerce, it is also equipped with the same procedure as commerce, from negotiation, contract signoff, payment and implementation to balance counting, and utilizes electronic technology in each step.
E-GOVERNANCE: E-governance refers to governance processes in which ICT plays an active and significant role for efficient and effective governance, making government more accessible and accountable to the citizens, and providing new governance services and products. E-governance challenges include: the ability to use required technologies; consistent and reliable infrastructure; coordination between implementing agencies; performance indicators and standards for e-government; ensuring access to useful information and services; issues of language and communication; educating citizens; ICT capacity building; and inclusion of all.


Technology trends relevant to e-governance include:
 Cloud computing
 Changing programming paradigms
 Accessibility of information, a major requirement in any e-governance system
 The semantic web in e-governance: services in each domain need to discover, "understand" and share each other's services
 Data fusion: techniques that combine data from multiple sources and gather the resulting information in order to draw inferences
 Dynamic data fusion methodologies and stream databases: building a technology stack and applications on top of them to create an area of expertise
 Mashups and Web 2.0/3.0: mash-ups, and the interfaces and programs developed to support them, facilitate dynamic content composition and the combining of disparate data sources (for example Yahoo! Pipes, Microsoft Popfly and the Google Mashup Editor)

IT ACT 2000
The Information Technology Act 2000 (also known as ITA-2000, or the IT Act) is an Act of the Indian Parliament (No. 21 of 2000) notified on October 17, 2000. The Information Technology Act 2000 consisted of 94 sections segregated into 13 chapters, with four schedules forming part of the Act. In the 2008 version of the Act there are 124 sections (excluding 5 sections that have been omitted from the earlier version) and 14 chapters; Schedules I and II have been replaced, and Schedules III and IV are deleted. The Information Technology Act 2000 addressed the following issues:
1. Legal recognition of electronic documents
2. Legal recognition of digital signatures
3. Offenses and contraventions
4. Justice dispensation systems for cybercrimes

MOBILE COMPUTING:
Mobile computing is a technology that allows transmission of data, voice and video via a computer or any other wireless-enabled device without having to be connected to a fixed physical link. The main concepts involved are:
 Mobile communication
 Mobile hardware
 Mobile software

3G
3G, or 3rd generation mobile telecommunications, is a generation of standards for mobile phones and mobile telecommunication services fulfilling the International Mobile Telecommunications-2000 (IMT-2000) specifications of the International Telecommunication Union. Application services include wide-area wireless voice telephony, mobile Internet access, video calls and mobile TV, all in a mobile environment.
GPS (Global Positioning System)
The Global Positioning System (GPS) is a space-based satellite navigation system that provides location and time information in all weather, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. The GPS program provides critical capabilities to military, civil and commercial users around the world. In addition, GPS is the backbone for modernizing the global air traffic system, and supports weather forecasting and location services.

Long Term Evolution (LTE)
LTE is a standard for wireless communication of high-speed data for mobile phones and data terminals. It is based on the GSM/EDGE and UMTS/HSPA network technologies, increasing capacity and speed through new modulation techniques, and is associated with the implementation of fourth-generation (4G) technology.

WiMax
WiMAX (Worldwide Interoperability for Microwave Access) is a wireless communications standard designed to provide 30 to 40 megabit-per-second data rates, with the latest update providing up to 1 Gbit/s for fixed stations. It is part of the fourth generation, or 4G, of wireless communication technology. WiMAX far surpasses the 30-metre wireless range of a conventional Wi-Fi local area network (LAN), offering a metropolitan area network with a signal radius of about 50 km. WiMAX offers data-transfer rates that can be superior to conventional cable-modem and DSL connections; however, the bandwidth must be shared among multiple users and thus yields lower speeds in practice.

Near Field Communication
Near field communication (NFC) is a set of standards for smartphones and similar devices to establish radio communication with each other by touching them together or bringing them into close proximity, usually no more than a few centimeters. Present and anticipated applications include contactless transactions, data exchange, and simplified setup of more complex communications such as Wi-Fi. Communication is also possible between an NFC device and an unpowered NFC chip, called a "tag".

Today's computing has rapidly outgrown the confines of a single location. With mobile computing, people can work from the comfort of any location they wish, as long as the connection and the security concerns are properly factored in. In the same light, the availability of high-speed connections has also promoted the use of mobile computing.
Being an ever-growing and emerging technology, mobile computing will continue to be a core service in computing and information and communication technology (ICT).

EMBEDDED SYSTEMS:
An embedded system is a computer system designed for specific control functions within a larger system, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. By contrast, a general-purpose computer, such as a personal computer (PC), is designed to be flexible and to meet a wide range of end-user needs. Embedded systems control many devices in common use today. Embedded systems contain processing cores that are either microcontrollers or digital signal processors (DSPs). The processor is an important unit of the embedded system hardware; it is the heart of the embedded system. The key characteristic, however, is being dedicated to handling a particular task. Since the embedded system is dedicated to specific tasks, design engineers can optimize it to reduce the size and cost of the product and increase its reliability and performance. Some embedded systems are mass-produced, benefiting from economies of scale. Physically, embedded systems range from portable devices such as digital watches and MP3 players, to large stationary installations like traffic lights and factory controllers, and to highly complex systems like hybrid vehicles, MRI scanners and avionics. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals and networks mounted inside a large chassis or enclosure. The major building blocks of an embedded system are listed below:
 Microcontrollers / digital signal processors (DSPs)
 Integrated chips
 Real-time operating system (RTOS), including board support package and device drivers
 Industry-specific protocols and interfaces
 Printed circuit board assembly
Usually, an embedded system requires mechanical assembly to accommodate all the above components and create a product or a complete embedded device. At runtime, most embedded software amounts to a dedicated loop that reads inputs, makes a decision and drives outputs; a minimal conceptual sketch of such a loop follows.
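To make the "dedicated to a particular task" idea concrete, here is a purely illustrative Python sketch of the read-decide-actuate loop that embedded firmware typically runs. The sensor and actuator functions (read_temperature, set_heater) are hypothetical stand-ins for real hardware drivers; production firmware would normally be written in C on a microcontroller, possibly under an RTOS.

# Conceptual sketch of the dedicated control loop at the heart of many
# embedded systems: read a sensor, decide, actuate, repeat. The function
# names are hypothetical placeholders for real hardware drivers.
import time

SETPOINT_C = 21.0   # target temperature
HYSTERESIS = 0.5    # avoid rapid on/off switching around the setpoint

def read_temperature():
    # Placeholder: a real system would sample an ADC or digital sensor.
    return 20.3

def set_heater(on):
    # Placeholder: a real system would drive a GPIO pin or relay.
    print("heater", "ON" if on else "OFF")

def control_loop():
    heater_on = False
    while True:
        t = read_temperature()
        if t < SETPOINT_C - HYSTERESIS:
            heater_on = True
        elif t > SETPOINT_C + HYSTERESIS:
            heater_on = False
        set_heater(heater_on)
        time.sleep(1.0)  # fixed control period of one second

# control_loop()  # runs forever, like firmware after reset

The hysteresis band is the design point here: without it, a reading hovering at the setpoint would toggle the actuator on every pass through the loop.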

Embedded systems are deployed in a wide variety of applications and span all aspects of modern life. The following section provides an overview of the emerging technological trends and their implications for the development of embedded systems.

Multi-core Processors
8-bit controllers were widespread for quite a long time and still power a multitude of embedded applications, for instance in home appliances, smart cards and automotive body electronics. To cater to the need for higher performance, these controllers advanced towards 16-bit and then 32-bit designs, as used in routers, cell phones and media players.

Wireless
For a long time, embedded devices mostly operated as stand-alone systems. With the advent of wireless connectivity, however, the scenario has changed. Both short-range wireless protocols, like Bluetooth, ZigBee, RFID and near field communication (NFC), and long-range protocols, such as wireless local area network (WLAN), WiMAX, long term evolution (LTE) and cellular communications, are bound to see more widespread application in the near future. The recent trends in wireless for use in embedded systems are in the areas of system-on-chip (SoC) architecture, reduced power consumption and application of short-range protocols.


Increased use of open source technology
Embedded systems have traditionally employed proprietary hardware, software, communication protocols and home-grown operating systems for their development. The payment of royalties to vendors for using a particular operating system has been a significant overhead for manufacturers of embedded systems.

Security
In an increasingly interconnected world, security in embedded devices has become critical. The security requirements for the huge base of connected embedded devices are distinct on account of their limited memory, constrained middleware and low computing power. Embedded security is the new differentiator for embedded devices. Progress in the areas of embedded encryption, cryptography, trusted computing and authentication is covered in the following sub-sections.

Device Convergence
Broadly speaking, any new device introduced in the cellular, consumer electronics or infotainment segment is a potential candidate for device convergence. A mobile phone not only enables one to receive calls but also serves as a camera, PDA, navigation device, music player and texting device, and can connect with other devices: a smartphone, after all. An automotive infotainment system contains a navigation device, video player, parking assistance, voice-controlled applications, Internet access devices, a lane departure system, GPS connectivity and Bluetooth-enabled headphones.

Internationalization
The devices currently manufactured offer rich multimedia support in terms of video and graphics and hence demand more processing power, higher resolution and greater bandwidth. Touch screens are on their way to becoming the standard in a variety of devices, including PDAs, infotainment systems, ticketing solutions, gaming consoles, mobile phones, music players and hand-held devices.

Smart Devices
Machine-to-machine (M2M) communication, through both wired and wireless mechanisms, is on the rise. While the technology for remote connectivity has been in use for a long time, what is changing now is the business scenario and the newer use cases that remote connectivity can be, and is being, applied to. This is mostly being triggered by the widespread adoption and proliferation of mobile-based communications.


DATA MINING AND DATA WAREHOUSING:
Data mining (the analysis step of the "Knowledge Discovery in Databases" process, or KDD), an interdisciplinary subfield of computer science, is the computational process of discovering patterns in large data sets using methods at the intersection of artificial intelligence, machine learning, statistics and database systems. The overall goal of the data mining process is to extract information from a data set and transform it into an understandable structure for further use. Aside from the raw analysis step, it involves database and data management aspects, data preprocessing, model and inference considerations, interestingness metrics, complexity considerations, post-processing of discovered structures, visualization and online updating. Data mining uses insights from past data to analyze the outcome of a particular problem or situation that may arise. Data mining analyzes data held in data warehouses, which store the data being analyzed. That data may come from all parts of a business, from production to management. Managers also use data mining to decide upon marketing strategies for their products, and can use the data to compare and contrast with competitors. Data mining turns its data into analysis that can be used to increase sales, promote new products, or drop products that add no value for the company.

In computing, a data warehouse or enterprise data warehouse (DW, DWH, or EDW) is a database used for reporting and data analysis. It is a central repository of data created by integrating data from one or more disparate sources. Data warehouses store current as well as historical data and are used for creating trending reports for senior management, such as annual and quarterly comparisons. This definition of the data warehouse focuses on data storage. The source data is cleaned, transformed, cataloged and made available for use by managers and other business professionals for data mining, online analytical processing, market research and decision support. However, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system. A toy sketch of the pattern-discovery step at the heart of data mining follows.
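As a toy illustration of pattern discovery, the following Python sketch counts how often pairs of items occur together across a set of transactions, a simplified form of the association analysis used in data mining; the transaction data is invented for the example.

# Toy illustration of one classic data mining task: finding item pairs
# that frequently occur together in transactions (association analysis).
# The transactions below are invented for illustration only.
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"milk", "butter"},
    {"bread", "butter"},
]

pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# "Support" of a pair = fraction of transactions containing both items.
for pair, count in pair_counts.most_common(3):
    support = count / len(transactions)
    print(pair, f"support={support:.2f}")

Real data mining tools apply the same counting idea at scale (for example, via the Apriori or FP-growth algorithms) against data drawn from the warehouse rather than a hand-written list.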
ELECTRONIC DATA INTERCHANGE
Electronic data interchange (EDI) is a method for transferring data between different computer systems or computer networks. It is commonly used by large companies for e-commerce purposes, such as sending orders to warehouses or tracking orders. It is more than mere e-mail; for instance, organizations might replace bills of lading and even cheques with appropriate EDI messages. The term also refers specifically to a family of standards.

TRENDS IN EDI
EDI via VPN
A Virtual Private Network (VPN) uses the public telecommunications network to provide private data communications. Most VPNs use the Internet and a variety of specialised protocols to secure their data communications. They tend to follow a client/server configuration in which VPN clients authenticate users, encrypt data and manage sessions with VPN servers employing a technique called tunnelling.

Web EDI
Web EDI is simply conducting EDI through an Internet browser. It replicates paper-based documents as a web form containing fields where users can enter information. Once all the relevant information has been added, it is automatically converted into an EDI message and sent via secure Internet protocols such as File Transfer Protocol Secure (FTPS), Hyper Text Transport Protocol Secure (HTTPS) or AS2. (A simplified sketch of this form-to-message conversion appears at the end of this EDI overview.)

Mobile EDI
An emerging area for EDI is mobile communications. EDI applications, however, were not designed with the mobile user in mind, and there have been questions about whether a user would want to use a mobile device to complete a purchase order or invoice while out of the office. Focusing on supply chain efficiencies, though, it is easier to see the benefit of a salesperson being able to check the status of a delivery to a supplier while on the road, or of a business manager being able to review supplier performance during a re-negotiation meeting.

EDI Outsourcing / Hosted EDI
Managed services is the outsourcing of an EDI program to a third-party provider. Implementing and managing an EDI platform can be daunting for any organisation: it requires access to a broad range of skills and capital investment in hardware and software. Many companies now also look to integrate EDI with their back-office systems, such as ERP, which creates a demand for internal resources that few organisations can sustain.

EDI Software
For companies that prefer to retain control of their EDI environment, rather than outsourcing the management of their EDI infrastructure or using hosted EDI solutions, implementing EDI software on a network of PCs behind a company firewall will be the preferred option.
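As promised above, here is a Python sketch of the Web EDI idea: form fields captured in a browser are mapped into a delimited, machine-readable message. The segment layout is deliberately simplified, in the general style of X12, and is not a conformant EDI document; the field names and values are invented.

# Simplified illustration of the Web EDI idea: form fields are mapped
# into a delimited, machine-readable message. The segment layout below
# is invented for illustration and is NOT conformant X12 or EDIFACT.
def form_to_edi(order):
    segments = [
        ["BEG", order["po_number"], order["date"]],           # order header
        *[["PO1", item["sku"], str(item["qty"]), item["unit_price"]]
          for item in order["items"]],                         # line items
        ["CTT", str(len(order["items"]))],                     # item count
    ]
    # Elements joined by '*', segments terminated by '~', X12 style.
    return "~".join("*".join(seg) for seg in segments) + "~"

order = {
    "po_number": "PO-1001",
    "date": "20240115",
    "items": [{"sku": "A-100", "qty": 5, "unit_price": "9.99"}],
}
print(form_to_edi(order))
# BEG*PO-1001*20240115~PO1*A-100*5*9.99~CTT*1~

A real Web EDI service would additionally validate the message against the agreed standard and transmit it over FTPS, HTTPS or AS2, as described above.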

BLUETOOTH
Bluetooth is a wireless technology standard for exchanging data over short distances (using short-wavelength radio transmissions in the ISM band from 2400 to 2480 MHz) between fixed and mobile devices, creating personal area networks (PANs) with high levels of security. Created by telecom vendor Ericsson in 1994, it was originally conceived as a wireless alternative to RS-232 data cables. It can connect several devices, overcoming problems of synchronization. Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 18,000 member companies in the areas of telecommunication, computing, networking and consumer electronics. Bluetooth was standardized as IEEE 802.15.1, but that standard is no longer maintained. The SIG oversees the development of the specification, manages the qualification program and protects the trademarks. To be marketed as a Bluetooth device, a product must be qualified to standards defined by the SIG. A network of patents covers the technology, and these are licensed only to qualifying devices. As Bluetooth wireless technology is incorporated into more personal mobile devices, it enables new uses for those devices. Let us now explore new applications for Bluetooth technology in the future.

Retail and Mobile e-Commerce: Mobile devices may in future be used as a payment medium for goods and services. Any terminal used for retail transactions could incorporate Bluetooth wireless technology and thus connect to other Bluetooth devices to complete retail transactions. For example:
 A mobile phone could connect to a soda machine over a Bluetooth link to pay for a soda, or link to a kiosk at which you could buy a theater ticket.
 A mobile phone, PDA or other device could be used to pay for goods and services using Bluetooth communication links with a cash register.
 Indeed, through the use of Bluetooth access points, entire shopping malls, arenas, grocery stores, restaurants and other retail areas could allow customers to perform financial transactions throughout the building.

Medical: Three possible applications of Bluetooth wireless technology in the medical domain are:
 Remote patient monitoring
 Wireless biometric data
 Medicine dispensers

Travel: With a personal device that employs Bluetooth wireless communications, a traveler might check in using this device, which could carry personal identity credentials, eliminating the need to insert a card into a terminal. Moreover, an electronic boarding pass could be issued and stored on the Bluetooth device; that same device could then be used to wirelessly present the boarding pass when boarding the aircraft, eliminating the need for a paper boarding pass.

Home Networking: A mobile phone could be used as a cordless phone via a Bluetooth voice access point (base station), and portable computers could be used at home through wireless dial-up networking or a data access point. Other emerging applications include:
 Monitors for blood sugar or other medical variables. These Bluetooth devices continuously track and wirelessly report diagnostic information. Over 25 million diabetic Americans, for example, could soon be equipped to have their cellular handsets emit an alarm as the individual's need for insulin approaches emergency levels.
 Wireless monitors and projectors. Prepare a presentation on a tablet computer; then, as you walk up to a projector, the mobile device wirelessly provides a video feed to the projector and begins to share your creation with everyone in the room.
 Payment systems. Payment by means of mobile telephones is already exploding; Starbucks receives over 100,000 mobile payments most days now. The systems can still improve in security and convenience, especially for paperless travel boarding passes, and Bluetooth will probably be part of the answer.
 Communicating clothing. Before long, your clothing may say, "Want to go for a walk? You've been in that chair for the last 110 minutes straight," or "OK: I can turn maroon for tonight."

Futuristic as it might sound, much of this is already happening. Who would have believed even 15 years ago that just about everyone today would own a cellular telephone? Wireless technology also holds the promise of transforming hearing aids into multi-functional communication devices, turning hearing loss into hearing gain. So a lot is yet to come!


GLOBAL POSITIONING SYSTEM
The Global Positioning System (GPS) is a space-based satellite navigation system that provides location and time information in all weather conditions, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. The system provides critical capabilities to military, civil and commercial users around the world. It is maintained by the United States government and is freely accessible to anyone with a GPS receiver.

A GPS receiver calculates its position by precisely timing the signals sent by GPS satellites high above the Earth. Each satellite continually transmits messages that include:
 the time the message was transmitted
 the satellite's position at the time of message transmission
The receiver uses the messages it receives to determine the transit time of each message, and computes the distance to each satellite using the speed of light. Each distance, together with the corresponding satellite's location, defines a sphere; the receiver lies on the surface of each of these spheres when the distances and satellite locations are correct. These distances and satellite locations are used to compute the location of the receiver via the navigation equations. This location is then displayed, perhaps on a moving map or as latitude and longitude; elevation information may be included. Many GPS units also show derived information such as direction and speed, calculated from position changes. A minimal two-dimensional sketch of this distance-based position solution follows.
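The following Python sketch shows a two-dimensional analogue of the computation just described, assuming invented transmitter positions and error-free ranges: subtracting the circle equations pairwise linearizes the problem, and solving the resulting 2x2 system recovers the receiver position. Real GPS works in three dimensions and also solves for the receiver clock bias, which is why a fourth satellite is needed.

# Minimal 2-D analogue of GPS trilateration: each measured range defines
# a circle around a known "satellite" position; intersecting the circles
# yields the receiver position. All coordinates and ranges are invented.
import math

sats = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]    # known transmitter positions
true_pos = (3.0, 4.0)
ranges = [math.dist(true_pos, s) for s in sats]  # "measured" distances

(x1, y1), (x2, y2), (x3, y3) = sats
r1, r2, r3 = ranges

# Subtracting circle equations linearizes the problem: A @ [x, y] = b
a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2

det = a11 * a22 - a12 * a21          # solve the 2x2 system by Cramer's rule
x = (b1 * a22 - b2 * a12) / det
y = (a11 * b2 - a21 * b1) / det
print(f"estimated position: ({x:.2f}, {y:.2f})")  # -> (3.00, 4.00)

With noisy real-world ranges, the same linearized equations are solved in a least-squares sense over all visible satellites rather than exactly.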

GPS shortcomings (some of which are also relevant for all GNSSs) can be summarized as follows:
• Weak signal strengths make the signals vulnerable to interference (intentional or unintentional), exposing critical user applications to possible episodes of denial of service.
• GPS in its standard mode cannot be used indoors, under trees, or in urban environments that may reduce the availability of navigation signals.
• There is no built-in integrity warning/assurance signal or service for GPS users.
• In order to improve the accuracy of GPS to the metre level or below, relative or differential GPS techniques must be used, leading to more complex (and more expensive) user technology.
• Only single-frequency GPS positioning is possible using standard GPS receivers, with expensive dual-frequency GPS receivers justified only for geodetic, survey and scientific applications.
• Civilian and military applications both use the same GPS signals.
• GPS control is firmly in the hands of the U.S. military (although there is some indirect U.S. civilian input via the Interagency GPS Executive Board, http://www.igeb.gov).
• The budget and maintenance of GPS are subject to an annual tussle in the U.S. Congress (though they are embedded within the military budget).
• GPS signals have no communication capability, so many telematics-type applications require the integration of separate wireless communications technology.

By the end of the first decade of the 21st century there will be several GNSSs, each with unique features, each operated by different agencies, and each chasing the same user markets. We may speak of generation 1 (GPS and GLONASS), generation 1.5 (modernized GPS) and generation 2 (Galileo and GPS-III) operating side by side over the next 10 or so years. The best outcome would be that although the various GNSSs would be independent of each other (so that one catastrophic failure would not bring every GNSS down), they would also be compatible and interoperable, so that user equipment may be developed that can utilise some or all of the broadcast signals.

LET'S LOOK BRIEFLY AT THE FUTURE GNSS COMPONENTS.

GPS Modernization and GPS-III
GPS modernization refers to the collection of system improvements (satellite, signal and control segment) that will change GPS from a 1st-generation GNSS to what might be called a 1.5G GNSS, principally by offering additional user signals. GPS modernization is focussed on improving accuracy and addressing signal vulnerability for civilian uses of GPS, primarily through the implementation of a new PRN ranging code on the L2 signal, and a new civilian signal at the L5 frequency of 1176.45 MHz. Studies for what we might call the 2nd-generation GPS (the modernization referred to above is essentially an enhancement of the current GPS, raising it to a 1.5G GNSS), or GPS-III, have commenced. However, GPS-III satellites are unlikely to be launched before 2012, several years after Galileo is scheduled to be fully operational.

GLONASS
GLONASS is a GNSS developed by the former U.S.S.R. during the Cold War and, like GPS, was initially intended primarily for military applications. Although the signal and code structure is different from that of GPS (see, e.g., Seeber, 2003), there are also many similarities. For example, GLONASS is a dual-frequency satellite-based radiolocation system that permits pseudorange and carrier phase measurements to be made. In addition, both point positioning and relative positioning are possible, with similar levels of accuracy to GPS.
Galileo
Since the late 1990s the European Union has been promoting the development of an independent GNSS under 'civilian control'. The primary motivation has been to address the needs of the transportation sector (particularly civil aviation) for a GNSS with guaranteed integrity. Galileo is the concept that has now been approved for development, with deployment planned by 2008. The signal and code structure is far more complex than for GPS, consisting of up to ten trackable signals and codes. There are signals in common with GPS L1 and L5, permitting a significant degree of 'interoperability' (i.e. integrated GPS/Galileo receivers able to track signals from both constellations). Unfortunately there are no plans for a Galileo signal that overlays the GPS L2 signal. Increasing the number of signals-in-space (from nearly 60 orbiting satellites) is to be applauded, as this leads to greater availability, particularly in urban environments. The combination of GPS and Galileo would certainly benefit many users (except perhaps those using very low-cost receivers). Nevertheless, global and regional augmentation services would still be expected to address concerns about the availability, integrity and accuracy of GPS, GLONASS and Galileo.

INFRARED COMMUNICATION
Infrared (IR) light is electromagnetic radiation with longer wavelengths than those of visible light, extending from the nominal red edge of the visible spectrum at 0.74 micrometres (µm) to 0.3 mm. This range of wavelengths corresponds to a frequency range of approximately 430 THz down to 1 THz, and includes most of the thermal radiation emitted by objects near room temperature. Infrared light is emitted or absorbed by molecules when they change their rotational-vibrational movements.

Infrared light is used in industrial, scientific and medical applications. Night-vision devices using infrared illumination allow people or animals to be observed without the observer being detected. In astronomy, imaging at infrared wavelengths allows observation of objects obscured by interstellar dust. Infrared imaging cameras are used to detect heat loss in insulated systems, to observe changing blood flow in the skin, and to detect overheating of electrical apparatus. Infrared imaging is used extensively for military and civilian purposes. Military applications include target acquisition, surveillance, night vision, homing and tracking. Non-military uses include thermal efficiency analysis, environmental monitoring, industrial facility inspections, remote temperature sensing, short-range wireless communication, spectroscopy and weather forecasting.


Infrared astronomy uses sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds, to detect objects such as planets, and to view highly red-shifted objects from the early days of the universe.

IR data transmission is also employed in short-range communication among computer peripherals and personal digital assistants. These devices usually conform to standards published by IrDA, the Infrared Data Association. Remote controls and IrDA devices use infrared light-emitting diodes (LEDs) to emit infrared radiation, which is focused by a plastic lens into a narrow beam. The beam is modulated, i.e. switched on and off, to encode the data (a simplified sketch of this encoding appears at the end of this section). The receiver uses a silicon photodiode to convert the infrared radiation to an electric current. It responds only to the rapidly pulsing signal created by the transmitter, and filters out slowly changing infrared radiation from ambient light. Infrared communications are useful for indoor use in areas of high population density: IR does not penetrate walls and so does not interfere with devices in adjoining rooms. Infrared is the most common way for remote controls to command appliances; infrared remote control protocols such as RC-5 and SIRC are used for this communication.

Free-space optical communication using infrared lasers can be a relatively inexpensive way to install a communications link in an urban area operating at up to 4 gigabit/s, compared with the cost of burying fiber-optic cable. Infrared lasers are also used to provide the light for optical fiber communications systems; infrared light with a wavelength around 1,330 nm (least dispersion) or 1,550 nm (best transmission) is the best choice for standard silica fibers. IR transmission of encoded audio versions of printed signs is being researched as an aid for visually impaired people through the RIAS (Remote Infrared Audible Signage) project.
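As a simplified illustration of such on/off modulation, the following Python sketch encodes one byte using pulse-distance encoding in the general style of the widely documented NEC remote-control protocol: every bit begins with a fixed IR burst, and the length of the following gap distinguishes a 0 from a 1. The timing constants are the commonly cited NEC values; the hardware side (driving the LED with a ~38 kHz carrier) is omitted.

# Sketch of the on/off (pulse-distance) encoding used by many IR remote
# protocols, loosely modeled on the NEC scheme. Times are microseconds.
BURST = 562        # IR carrier on
GAP_ZERO = 562     # short gap -> bit 0
GAP_ONE = 1687     # long gap  -> bit 1

def encode_byte(value):
    """Return a list of (led_on_us, led_off_us) pairs, LSB first."""
    pulses = []
    for bit in range(8):
        gap = GAP_ONE if (value >> bit) & 1 else GAP_ZERO
        pulses.append((BURST, gap))
    return pulses

print(encode_byte(0xA5))
# [(562, 1687), (562, 562), (562, 1687), (562, 562), ...]

This gap-length scheme is exactly why the receiver described above only needs to detect the presence or absence of the rapidly pulsing carrier, not its amplitude.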
SMART CARD
A smart card, chip card, or integrated circuit card (ICC) is any pocket-sized card with embedded integrated circuits. Smart cards are made of plastic, generally polyvinyl chloride, but sometimes polyethylene terephthalate-based polyesters, acrylonitrile butadiene styrene or polycarbonate. Smart cards can provide identification, authentication, data storage and application processing, and may provide strong security authentication for single sign-on (SSO) within large organizations.
SQL
SQL (Structured Query Language) is a special-purpose programming language designed for managing data held in relational database management systems (RDBMS). SQL is designed for a specific purpose: to query data contained in a relational database. SQL is a set-based, declarative query language, not an imperative language like C or BASIC. However, there are extensions to standard SQL, such as Oracle's PL/SQL, which add procedural programming language functionality, such as control-of-flow constructs. (A brief runnable example of SQL's declarative style appears at the end of this section.) The pros and cons below concern PL/SQL.

PROS:
 Easy to learn
Everybody who has hands-on experience with PL/SQL must agree with this: PL/SQL is such an English-like language that it is very easy to learn. You probably need only three to six months to become a PL/SQL newbie, and then you can start coding. As a result, you can easily find junior PL/SQL developers in the market; if you can't, you can easily produce them through training!
 Better performance in manipulating data in the Oracle database
The PL/SQL engine is developed by Oracle and lives inside the Oracle database. It also has better integration with SQL, so it is no secret that PL/SQL can provide better performance than any other language for this work. Another reason is that PL/SQL code runs close to the data, which saves interaction time between application and database; with any other language, the application is usually outside the database. If you need to transfer bulk data between application and database, PL/SQL can save you significant time.
 Good security
Because PL/SQL code resides inside the Oracle database, which is normally very well protected by most enterprises, the PL/SQL language also provides good security for your business logic.
 Better capacity to use Oracle advanced features
Oracle has many advanced features (such as advanced queuing, advanced replication and DBMS jobs), and many of these are provided through Oracle's supplied PL/SQL packages. Using PL/SQL as your application language gives you a great advantage if you want to benefit from these features, and also provides good productivity.

CONS:
 Not easy to become a senior developer

Yes, it is quite easy to become a junior PL/SQL developer, but this is not the case for becoming a senior one. The reason is that a senior PL/SQL developer is expected to have great in-depth knowledge of the PL/SQL language itself. You pretty much have to be a database expert, know some other coding language at least at a basic level, and have hands-on experience handling big volumes of data; all of this takes time to learn, and sometimes you need the opportunity to gather the experience. Props to the senior PL/SQL developers amongst us!
 Higher maintenance cost
PL/SQL can deliver good results in the development phase, but the maintenance cost is relatively high. One reason is that it is quite common for people to write business logic and SQL together in PL/SQL code, which results in poor readability. Also, compared with object-oriented languages, PL/SQL does not have concepts like abstraction and inheritance, so PL/SQL needs more effort to deliver similar functionality.
 Limited to database-centric applications
Yes, PL/SQL can be used to write non-database-centric applications, but it is not recommended; there are many better options in that area. PL/SQL is designed for database-centric applications; this is PL/SQL's true domain, and PL/SQL would be nothing if it were defeated in this domain; fortunately, that has not happened so far.
The suggestion is to analyse the pros and cons of PL/SQL carefully and determine whether the pros are strong enough to outweigh the disadvantages. Many firms definitely see the value in using PL/SQL today.
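As promised above, here is a brief, runnable illustration of SQL's set-based, declarative style, using Python's built-in sqlite3 module (SQLite rather than Oracle); the table and rows are invented for the example.

# A brief, runnable illustration of SQL's set-based, declarative style,
# using Python's built-in sqlite3 module. Table and data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)

# One declarative statement describes WHAT result is wanted; the
# database engine decides HOW to compute it.
for row in con.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
):
    print(row)   # ('alice', 150.0) then ('bob', 75.5)

Note that no loops or sorting logic appear in the query itself: grouping, aggregation and ordering are all expressed declaratively, which is the contrast with imperative languages drawn above.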

