The Internet has changed the world and made work simpler. It is a global technology that connects many computer networks (public, private, business, and government) into one huge network of networks. The Internet carries a vast store of information that we can reach with just a click of the mouse; almost anything we want, we can find easily. Website addresses usually begin with WWW (World Wide Web), and pages are delivered to us over HTTP (Hypertext Transfer Protocol). We can mail anyone using e-mail. Beyond that, the Internet offers online chat, file transfer from one computer to another, games, online business, and videos on many different sites. We can upload and download images, and we can even chat and see each other's faces using VoIP (Voice over Internet Protocol).
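As a small, hedged illustration of that point about HTTP (the URL is only an example; any reachable website would do), here is how a page can be fetched over HTTP using Python's standard library:

```python
# Minimal example: requesting a web page over HTTP.
# The URL is only an example, not a recommendation.

from urllib.request import urlopen

with urlopen("http://www.example.com/") as response:
    status = response.status      # e.g. 200 when the request succeeds
    body = response.read(200)     # first 200 bytes of the HTML page

print("HTTP status:", status)
print("Start of page:", body.decode("utf-8", errors="replace"))
```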
The Internet was invented in the late 1960s, but it only reached the general public in the 1990s, and by then it was no longer obscure: almost everybody in the world was aware of this new technology. Since then the world has been using the Internet, and its use is increasing day by day. Much of our work now depends on it. Many search engines make our work easier; Yahoo and Google are the most widely used. People all over the world use the Internet and take advantage of this useful technology.
The Internet has many uses. One of the main and simplest is e-mail: we can send letters and images to someone else in just a click, which makes our work easier and takes less time. Documents can be shared from one place to another, and IT (Information Technology) makes heavy use of the Internet. The Internet can be used almost anywhere, on devices ranging from mobile phones to laptops, desktops, and palmtops. Many mobile phones now support GPRS as well as Wi-Fi, which makes things even easier, because we do not have to go anywhere: we can get online right where we are. Getting Internet access on such small devices is itself a recent achievement, and a very handy one.
The Internet has now become a large market, and for advertising companies it is a very low-cost channel; this is what we call e-commerce. It is a much faster way to spread information at a very reasonable cost. It offers many features such as online shopping: if we want to buy a product from any company, we can go to the desired website and order it, which is a very convenient way of shopping. We can also download music, pictures, and many other things online. There are social networking sites like Facebook, Orkut, and MySpace where we can make friends and share many things. The Internet can be broadly classified into many different activities, and its many features make it very comfortable to work with.
Many users of social networking sites are between 13 and 25 years old. With its low cost and the ease of sharing and gaining knowledge, Internet technology has made a dream come true. Chat rooms have made the Internet even more useful: we can chat online while working at the computer using Yahoo Messenger, Gtalk, or Skype, sharing messages more quickly than by e-mail. Web browsers like Firefox, Opera, Safari, and Google Chrome make using the Internet more comfortable.
Many people use the Internet mainly for sharing information, chatting, and exchanging files and documents, and the browsers mentioned above make this work easy. In the IT field the Internet is used constantly for sharing, transferring, and retrieving information. Using the web, it is easy to publish and share our ideas in a very convenient way. Blogs are a simple way to do this: a blog is an online diary where we can express our ideas better, and many users keep an account to share theirs. We can even make our own website on the web.
Communication with different people has become much easier with the introduction of e-mail; we can send one e-mail to many different addresses at the same time. Internet telephony is another mode of communication, and it too is a creation of the Internet.
But there are disadvantages to using the Internet as well. Hacking is the biggest one: many hackers break into websites and steal even personal information. We should be aware of this, use the Internet properly, and protect ourselves from hacking.
Artificial intelligence: when we hear the name, a robot from a film, looking like a man (like the one in the movie I, Robot), flashes before our eyes. So what is artificial intelligence? Experts seem to give a multitude of responses to this question.
Some say it is 'the ability of machines to do things that are exclusive to humans', while others say it is 'the ability of a computer or computer-assisted machine to do things that would otherwise require human intervention'.
Whatever the definition of artificial intelligence may be, the bottom line is that we can create non-human things that can think and act on their own. There are many simple applications where artificial intelligence (A.I.) is used, for example our cellular mobile networks, security systems, and autonomous machines used in construction and industry.
But the current frontier applications of A.I. are in the fields of autonomous robots, autonomous flying machines, face and speech recognition, autonomous machines that do not need human supervision, and so on.

Since the development of the digital computer at the dawn of the 1940s, it has been shown that computers can be programmed to carry out very complex tasks, for example discovering proofs for mathematical theorems or playing chess. Still, despite continuing advances in computer processing speed and memory capacity, there are as yet no programs that can match human flexibility in tasks over wider domains requiring much everyday knowledge.
A human can do mathematical calculations and at the same time recognise faces, sounds, and smells, and can do almost any physical work when aptly trained. This is hard for A.I., because a separate (and complex) program must be written for each operation, so creating general artificial intelligence seemed impossible until scientists turned to nature for inspiration. Alan Turing, one of the prominent researchers in the field from the 1930s onward, said: "Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child's? If this were then subjected to an appropriate course of education one would obtain the adult brain." This idea is used extensively in today's A.I., where the system is programmed to learn.
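As a toy, hedged illustration of this "program it to learn" idea (not Turing's own method; every name and number below is made up for the example), here is a minimal perceptron in Python that learns the logical AND function from examples:

```python
# Minimal perceptron: learns the logical AND function from examples.
# A toy illustration of "learning from experience", not any specific
# historical system.

training_data = [
    ((0, 0), 0),
    ((0, 1), 0),
    ((1, 0), 0),
    ((1, 1), 1),
]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

def predict(x):
    activation = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 if activation > 0 else 0

# Repeatedly nudge the weights in the direction that reduces the error.
for epoch in range(20):
    for x, target in training_data:
        error = target - predict(x)
        weights[0] += learning_rate * error * x[0]
        weights[1] += learning_rate * error * x[1]
        bias += learning_rate * error

for x, target in training_data:
    print(x, "->", predict(x), "expected", target)
```

After training, the four examples are classified correctly even though no rule for AND was ever written by hand.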
So how do we create intelligence?
Intelligence can be summarised as:
- Learning
- Reasoning
- Problem solving
- Perception
- Language
If we are able to simulate these, we can say that we have created a reasonable A.I. One route to A.I. is through neural networks and parallel computing methods. An intriguing application that people can experience directly is chatterbots: programs that can recognise and respond to human speech, usually through text, although advances in the field now allow people to converse with chatterbots by voice. One such chatterbot (it understands whatever you type and replies accordingly) is ALAN, which is being developed at a research facility in Israel named 'Ai Research'.
Here is a link that will let you talk to ALAN (you would be amazed by ALAN's fluency in understanding the questions or things you say):
http://www.a-i.com/show_tree.asp?id=124&level=3&root=115
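To make the idea concrete, here is a minimal rule-based chatterbot sketch in Python. It is only a toy under obvious assumptions (keyword rules and canned replies, with invented names like DemoBot), and it says nothing about how ALAN actually works:

```python
# A minimal rule-based chatterbot (toy example, not ALAN's design).
# It matches keywords in the user's input and picks a canned reply.

import re

RULES = [
    (re.compile(r"\bhello\b|\bhi\b", re.I), "Hello! How are you today?"),
    (re.compile(r"\byour name\b", re.I),    "My name is DemoBot."),
    (re.compile(r"\bweather\b", re.I),      "I cannot see outside, but I hope it is sunny."),
    (re.compile(r"\bbye\b", re.I),          "Goodbye!"),
]

def reply(text):
    for pattern, answer in RULES:
        if pattern.search(text):
            return answer
    return "Interesting. Tell me more."

if __name__ == "__main__":
    print("Type 'bye' to quit.")
    while True:
        user = input("> ")
        print(reply(user))
        if re.search(r"\bbye\b", user, re.I):
            break
```

Real chatterbots go far beyond this, with spelling tolerance, conversation memory, and learning from past dialogues.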
Other applications in use today include flight control and stabilisation, A.I. in games, A.I. in robots and UAVs, and A.I. for autonomous vehicles such as cars and other transport vehicles.
A system is a set or arrangement of procedures or programs that together form a single unit: a set of facts, principles, and rules classified and arranged in an orderly fashion, showing a logical plan for joining the parts. It is a method, plan, or classification procedure for doing something, and also an arrangement of elements directed at a predefined goal in the processing of information. The analysis of such a system is carried out taking certain principles into account:
* The information domain of the problem must be represented and understood.
* The functions the software must perform must be defined.
* The behavior of the software in response to external events must be represented.
* The models representing information, functions, and behavior must be partitioned hierarchically.
The process should move from the essential information toward the implementation details.
The function under analysis may support business activities or produce a product that can be sold for profit. To achieve this goal, a computer-based system makes use of six elements:
* Software: the computer programs, data structures, and documentation that implement the required methods, procedures, and controls of the program.
* Hardware: the electronic and electromechanical devices that provide fast, accurate, and effective computing power and functions (computers, sensors, machinery, pumps, readers, etc.), and that provide external functions within the system.
* Staff: the operators or direct users of the system's tools.
* Database: a large, organized collection of information linked to the system and accessed through the software.
* Documentation: the manuals, forms, and other descriptive information that describes or gives instructions for the use and operation of the program.
* Procedures: the steps that define the specific use of each element or component of the system and its rules of operation and maintenance.
A system analysis is carried out with the following objectives in mind:
* Identify customers' requirements.
* Evaluate the system concept with the client to establish its feasibility.
* Perform technical and economic analysis.
* Allocate functions to the hardware, software, personnel, database, and other elements of the system.
* Establish budget and schedule constraints.
* Create a system definition that forms the foundation of all engineering work.
Achieving these goals requires a deep knowledge and mastery of hardware and software, of human engineering (management and personnel administration), and of database administration.
Objectives of Analysis
Identifying needs: this is the first step of system analysis. In this process the analyst meets with the client and/or user (an institutional or departmental representative, or an individual customer) and identifies the overall goals, then analyzes the client's perspectives, needs, and requirements regarding schedule and budget, marketing, and other points that may help identify and develop the project.
Some authors call this part "Requirements Analysis" and divide it into five parts:
* Recognition of the problem.
* Assessment and Synthesis.
* Modeling.
* Specification.
* Review.
Before meeting with the analyst, the client usually prepares a project concept paper, although it is recommended that this document be developed through client-analyst communication, since a paper written by the client alone would in any case be altered during the identification of needs.
Feasibility Study
Often, when undertaking a systems development project, the available resources and time are not realistic for completing it without economic losses and professional frustration. Feasibility and risk analysis are related in many ways: if the project risk is high, the feasibility of producing quality software is reduced. Four main areas of interest must be taken into account:
Economic viability
An assessment of development costs, compared to net income or profits from the product or system developed.
Technical Analysis
A study of features, performance and restrictions that may affect the achievement of an acceptable system.
Legal Analysis
This determines any possible infringement, violation, or liability that could be incurred in developing the system.
Alternatives. An assessment of alternative approaches to the development of the product or system.
The feasibility study can be documented as a separate report to senior management.
Economic and Technical Analysis
The economic analysis includes what we call cost-benefit analysis: an assessment of the economic investment compared with the benefits to be gained from marketing and using the product or system.
Often in the development of computer systems these benefits are intangible and somewhat difficult to assess, and this varies with the characteristics of the system. Cost-benefit analysis is a very important phase, since the possibility of developing the project depends on it.
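As a hedged illustration, with figures that are pure assumptions rather than data from any real project, the core of a cost-benefit comparison can be reduced to simple arithmetic such as payback period and return on investment:

```python
# Toy cost-benefit sketch; every figure here is an invented assumption.

development_cost = 120_000.0   # one-time cost to build the system
annual_benefit   = 50_000.0    # expected yearly savings or revenue
annual_upkeep    = 10_000.0    # yearly maintenance cost
years            = 5           # evaluation horizon

net_annual = annual_benefit - annual_upkeep
total_net_benefit = net_annual * years - development_cost

payback_years = development_cost / net_annual   # 3.0 years here
roi = total_net_benefit / development_cost      # about 67% here

print(f"Payback period: {payback_years:.1f} years")
print(f"ROI over {years} years: {roi:.0%}")
```

Intangible benefits, as noted above, do not fit neatly into such formulas, which is precisely what makes this phase difficult.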
In the technical analysis, the analyst evaluates the engineering principles of the system and at the same time collects information on performance, reliability, maintainability, and productivity.
The results of the technical analysis are the basis for deciding whether to continue or abandon the project: whether there are risks that it will not work, will not achieve the desired performance, or that its pieces will not fit together.
Modeling System Architecture
When we want to understand better what we are going to build, in the case of buildings, tools, aircraft, or machines, we create an identical model at a smaller scale.
But when the thing being built is software, our model must take a different form: it must represent all the functions and subfunctions of a system. These models focus on what the system must do, not on how it does it, and they may include graphical notation, information, and system behavior.
All computer-based systems can be modeled as a transformation of information using an architecture of inputs and outputs.
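A minimal sketch of that input-transformation-output view, with function names and sample data that are illustrative assumptions only, might look like this:

```python
# Minimal input-process-output model of a computer-based system
# (illustrative names and hard-coded sample data).

def read_input():
    # Input: gather raw data.
    return [3, 1, 4, 1, 5, 9, 2, 6]

def transform(data):
    # Transformation: turn raw data into useful information.
    return {"count": len(data), "total": sum(data), "mean": sum(data) / len(data)}

def write_output(info):
    # Output: present the transformed information.
    for key, value in info.items():
        print(f"{key}: {value}")

write_output(transform(read_input()))
```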
A supercomputer is the most powerful and fastest type of computer in existence at a given time. These machines are designed to process huge amounts of information quickly, and they are dedicated to specific tasks or applications rather than general personal use. Instead, they are engaged in:
1. Searching for oil fields using large seismic databases.
2. The study and prediction of tornadoes.
3. The study and prediction of weather anywhere in the world.
4. The development of models and designs for the creation of aircraft and flight simulators.
We must also add that supercomputers are a relatively new technology, so their use is not widespread and the field is sensitive to change. For this reason their price is very high, over $30 million, and the number produced each year is small.
Concept
Supercomputers are the most powerful and fastest type of computer in existence at a given time. They are large, the biggest among their peers. They can process huge amounts of information quickly and execute millions of instructions per second; they are intended for specific tasks and have a very large storage capacity. They are also the most expensive, with a cost that can exceed $30 million. Because of this high manufacturing cost very few are built each year, and some are manufactured only on request.
They have special temperature control to dissipate the heat that some components can reach. The central machine acts as an arbiter of all applications and controls access to all files, as well as the input and output operations. The user turns to the organization's central computer when processing support is required.
They are designed as multiprocessing systems: the CPU is the processing center and can support thousands of online users. The number of processors in a supercomputer depends mainly on the model and can range from about 16 up to 512 (as in a 1997 NEC SX-4) and beyond. Machines belonging to the supercomputer class include the Cray-1, Cyber, and Fujitsu machines, among others.
Some History
- MANCHESTER MARK I (1948)
The first British supercomputer laid the foundations for many concepts still used today.
In modern terms it had a RAM (random access memory) of just 32 positions or 'words'. Each word consisted of 32 bits (binary digits), which means the machine had a total of 1024 bits of memory.
The RAM technology was based on the cathode ray tube (CRT). CRTs were used to store bits of data as phosphor-charged areas of the screen, appearing as a series of glowing dots. The CRT's electron beam could efficiently manipulate this charge, writing a 1 or a 0 and reading it back later as required.
By the mid-1950s Britain was behind the U.S. in the production of high-performance computers. In the fall of 1956 Tom Kilburn (co-designer of the Manchester Mark I) initiated an effort known as the MUSE computer (from 'microsecond').
The design specifications included a target speed close to one instruction per microsecond and the need to attach a large number of peripherals of various types. They also required immediate-access storage of larger capacity than anything available at the time.
Special techniques were used that eventually included what are now known as multiprogramming, scheduling, spooling, interrupts, pipelining, interleaved storage, autonomous transfer units, paging, and virtual storage: techniques not yet established at the time.
In 1959 the computer was renamed Atlas and was subsequently developed as a joint effort between Tom Kilburn's group at Manchester University and the Ferranti company. Atlas was launched on December 7, 1962, and was considered at the time to be the most powerful computer in the world. It was 80 times more powerful than Meg/Mercury and 2,400 times more powerful than the Mark 1.
The first IBM computer at Daresbury, an IBM 1800, arrived in June 1966 and acted as a control and data-transfer computer for the NINA synchrotron, then the main experimental facility. It was quickly followed by the first IBM mainframe at Daresbury, the IBM 360/50, which started service in July 1966. This was replaced by an IBM 360/65 in November 1968.
During the early years the main task was to provide computing power to the laboratory's High Energy Physics groups. Computing was very different in those days. The usual way to give the computer work was on punched cards (although some stalwarts still insisted on 5-hole paper tape). Typically one prepared a job on punched cards and placed the deck in a tray; an operator would then load the cards and later return the tray along with the line-printer output the job had produced.
Loading and turnaround time was measured in tens of minutes at least. Toward the end of the 1960s the mean time between failures was about a day, yet these failures were effectively 'ignored' by users, who simply waited for their tray to reappear and registered them only as a slight delay in the speed of operation. The NAS/7000 (an IBM 'clone') was installed in June 1981.
This gave a huge increase in power and accuracy compared to previous systems.
The Cray-1 was the first "modern" supercomputer.
One of the reasons the Cray-1 was so successful was that it could perform more than one hundred million arithmetic operations per second (100 Mflop/s).
If today, following a conventional approach, you tried to assemble the same speed from PCs, you would need to connect 200 of them, or you could simply buy 33 Sun4s.
CONVEX C-220 AND THE UNIX REVOLUTION
The arrival of UNIX qualitatively changed the way scientists addressed computing problems. It offers a flexible way of providing computing power in a rapidly changing hardware market, and a crucial way of meeting the changing requirements of users' scientific applications. New components can be added simply, and power increased as needed.
Intel
The 64-node Intel iPSC/860 is called RX. Each node has a 40 MHz clock and 16 Mbytes of memory. The direct-connect hardware allows node-to-node data transfer at 2.8 Mbytes/second. There are 12 Gbytes of locally attached disk, and Ethernet connections to a Sun-670MP workstation for user access.
A maximum single-node performance of 40 Mflops gives a total of more than 2.5 Gflops for the whole machine. The software provided to make programming easier includes Fortran and C compilers.
In 1995
A set of 26 workstations running under the UNIX system, with software allowing them to run jobs independently or to work together, transferring data over a high-speed switch.
The IBM SP2 Computer
It consists of 24 P2SC ("Super Chip") nodes, plus two older wide nodes, located in two racks (only the second of which is shown in the photograph). Each node has a 120 MHz clock and 128 Mbytes of memory. Two new High Performance Switches (TV3) are used to connect the nodes together. Data storage consists of 40 Gbytes of locally attached fast disks, with Ethernet and FDDI networks for user access.
A maximum single-node performance of 480 MFLOPS offers a total of over 12 Gflops for the whole machine.
A PowerPC RS/6000 workstation is connected to the SP2 for hardware and software monitoring and management.
SOVIET SUPERCOMPUTERS
Just as there was a space race and an arms race, it should surprise no one that there was also a supercomputer race. The Soviet Union's high-performance machines were, of course, developed in secret. The information available is, sadly, probably still fairly vague.
- THE BESM FAMILY
A numerical series of "high-performance" computers.
The BESM-6 was designed in 1965 by a group of engineers working at the S. A. Lebedev Institute of Precision Mechanics and Computer Engineering (ITMiVT in Russian).
Production started in 1967 at the SAM plant ("SAM" stands for "Computing-Analytical Machines") in Moscow. The basic configuration included the CPU, 192 KB of core memory, magnetic drums, a patented magnetic tape drive, teletype machines, typewriters (with a parallel interface), alphanumeric printers, and paper-tape/card readers and punches. About 350 units had been built by the early 1980s. The latest configurations included standard 1/2-inch magnetic tape, disk drives cloned from IBM, serial VDUs, plotters, etc., mostly imported or cloned from the original hardware.
Today the design of supercomputers is based on four important technologies:
- Vector register technology, created by Cray, the father of supercomputing, who invented and patented several technologies that led to the creation of ultra-fast machines. This technology allows many arithmetic operations to be executed in parallel.
- The system known as M.P.P., from the initials of Massively Parallel Processors (massively parallel processing), which involves the use of hundreds and sometimes thousands of tightly coupled microprocessors.
- Distributed computing technology: clusters of general-purpose, relatively inexpensive computers interconnected by low-latency, high-bandwidth local networks.
- Quasi-supercomputing: recently, with the popularization of the Internet, distributed computing projects have emerged in which special software exploits the idle time of thousands of personal computers to perform large tasks at low cost. Unlike the previous three categories, the software running on these platforms must be able to divide the work into independent calculation blocks that are not reassembled or communicated for several hours. Several well-known volunteer-computing projects stand out in this category; a minimal sketch of the underlying divide-and-combine idea follows this list.
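As a hedged, minimal sketch of that divide-and-combine idea (the workload, a sum of squares, and the block size are illustrative assumptions), here is a toy example using Python's standard multiprocessing module:

```python
# Toy example: split a large task into independent blocks, compute them
# in parallel, then combine the partial results. Illustrative only.

from multiprocessing import Pool

def block_sum(block):
    # Each block needs no communication with the others, which is the
    # key property that distributed computing projects rely on.
    start, end = block
    return sum(n * n for n in range(start, end))

if __name__ == "__main__":
    N = 10_000_000
    block_size = 1_000_000
    blocks = [(i, min(i + block_size, N)) for i in range(0, N, block_size)]

    # One worker process per CPU core by default.
    with Pool() as pool:
        partial_results = pool.map(block_sum, blocks)

    print("Sum of squares below N:", sum(partial_results))
```

A real volunteer-computing project distributes such blocks to machines across the Internet instead of to local processes, but the structure of the computation is the same.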
HIGHLIGHTS
- Processing speed: billions of floating-point instructions per second.
- Users at a time: up to thousands, in a large network environment.
- Size: they require special facilities and industrial air conditioning.
- Ease of use: for specialists only.
- Usual clients: large research centers.
- Social penetration: practically nil.
- Social impact: almost zero for individuals, but without supercomputers we could not do things like predict the weather well in advance or solve very complex calculations that cannot be done by hand.
- Installed base: fewer than a thousand worldwide.
- Cost: up to tens of millions of dollars each.
Conclusion
A supercomputer is a computer with calculation capabilities far beyond those common in machines of the same period of manufacture.
They are very expensive, so their use is limited to military agencies, governments, and large businesses. They usually run scientific applications, especially real-world simulations.
Some well-known supercomputers are Blue Gene, the Cray machines, Deep Blue, Earth Simulator, MareNostrum, etc.
Supercomputers are usually designed according to one of the following four models:
* Vector registers.
* M.P.P. systems (Massively Parallel Processors).
* Distributed computing technology.
* Quasi-Supercomputer.
The most common uses for supercomputers include weather forecasting, complex 3D animations, fluid dynamic calculations, nuclear research, oil exploration, etc.