A system is a set or arrangement of procedures or programs that together form a single unit: a collection of facts, principles, and rules classified and arranged in an orderly fashion, showing a logical plan that links its parts; a method, plan, or procedure for classifying or doing something. It is also an arrangement of elements organized toward a predefined goal in the processing of information. Analysis is carried out according to certain principles:
* The information domain of the problem must be represented and understood.
* The functions to be performed by the software must be defined.
* The behavior of the software in response to external events must be represented (see the sketch after this list).
* The models representing information, function, and behavior must be partitioned hierarchically.
The process should move from the essential information toward the implementation details.
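As a minimal illustration of the third principle above, the sketch below expresses software behavior as responses to external events using a small state-transition table. The states and events are invented for the example, not drawn from any particular method.

```python
# Event-driven behavior as a state-transition table.
# States and events are hypothetical examples.
TRANSITIONS = {
    ("idle",       "start_request"): "processing",
    ("processing", "data_ready"):    "reporting",
    ("processing", "error"):         "idle",
    ("reporting",  "ack"):           "idle",
}

def handle_event(state: str, event: str) -> str:
    """Return the next state for an external event; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["start_request", "data_ready", "ack"]:
    state = handle_event(state, event)
    print(event, "->", state)  # idle -> processing -> reporting -> idle
```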
The system under analysis may support business activities or implement a product that can be sold for profit. To achieve its goal, a computer-based system makes use of six elements:
* Software: the computer programs, data structures, and documentation that realize the method or control logic the system requires.
* Hardware: electronic and electromechanical devices (computers, sensors, machinery, pumps, readers, etc.) that provide fast, accurate, and effective computing power and perform external functions within the system.
* Staff: the operators and direct users of the system's tools.
* Database: a large, organized collection of information linked to the system and accessed through the software.
* Documentation: manuals, forms, and other descriptive material that explains the use and operation of the system.
* Procedures: the steps that define the specific use of each element of the system, together with its rules of operation and maintenance.
A system analysis is carried out with the following objectives in mind:
* Identify customers' requirements.
* Evaluate the system concept with the client to establish its feasibility.
* Perform technical and economic analysis.
* Assign functions to hardware, software, personnel, database and other elements of the system.
* Establish budget and schedule constraints.
* Create a system definition that forms the foundation of all engineering work.
Achieving these goals requires broad knowledge and mastery of hardware and software, of human engineering (management and personnel administration), and of database administration.
Objectives of Analysis
Identifying needs: this is the first step of system analysis. In it the analyst meets with the client and/or user (an institutional or departmental representative, or an individual customer) and identifies the overall goals, analyzing the client's perspective, needs, and requirements with respect to schedule and budget, marketing, and other points that may help to define and develop the project.
Some authors call this part "Requirements Analysis" and divide it into five parts:
* Recognition of the problem.
* Assessment and Synthesis.
* Modeling.
* Specification.
* Review.
Before meeting with the analyst, the client often prepares a project concept document, although it is better for this document to emerge from client-analyst communication, since anything the client prepares alone will in any case be revised during the identification of needs.
Feasibility Study
Often, when the development of a systems project is undertaken, the available resources and time are not realistic for completing it without economic loss and professional frustration. Feasibility and risk analysis are related in many ways: if project risk is high, the viability of producing quality software is reduced. Four main areas of interest must be taken into account:
Economic viability
An assessment of development costs weighed against the net income or profit expected from the developed product or system.
Technical Analysis
A study of the functions, performance, and constraints that may affect the achievement of an acceptable system.
Legal Analysis
Determines any possible infringement, violation, or liability that could result from developing the system.
Alternatives
An assessment of alternative approaches to the development of the product or system.
The feasibility study can be documented as a separate report to senior management.
Economic and Technical Analysis
The economic analysis includes what we call cost-benefit analysis: an assessment of the economic investment compared with the benefits to be gained from marketing and using the product or system.
In the development of computer systems these benefits are often intangible and somewhat difficult to assess, and they vary with the characteristics of the system. Cost-benefit analysis is a very important phase, since the feasibility of developing the project depends on it. A small worked example follows.
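As a minimal illustration of such a calculation (all figures are invented for the example; real analyses would also discount future benefits):

```python
# Cost-benefit sketch with hypothetical figures.
development_cost = 120_000      # one-time cost of building the system
annual_benefit = 60_000         # estimated yearly savings or revenue
annual_operating_cost = 15_000  # yearly maintenance and operation

net_annual_benefit = annual_benefit - annual_operating_cost
payback_years = development_cost / net_annual_benefit
roi_5yr = (5 * net_annual_benefit - development_cost) / development_cost

print(f"Payback period: {payback_years:.1f} years")  # 2.7 years
print(f"5-year ROI: {roi_5yr:.0%}")                  # 88%
```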
In the technical analysis, the analyst evaluates the technical merits of the system concept while collecting information about performance, reliability, maintainability, and productivity.
The results of the technical analysis form the basis for deciding whether to continue with or abandon the project: whether there are risks that it will not work, that it will not achieve the desired performance, or that its pieces will not fit together.
Modeling System Architecture
When we want to understand better what we are going to build, in the case of buildings, tools, aircraft, or machines, we create an identical model at a smaller scale.
But when what we build is software, our model must take a different form: it should represent all the functions and subfunctions of the system. The models focus on what the system must do, not on how it does it, and they may cover graphic notation, information, and system behavior.
All computer-based systems can be modeled as a transformation of information using an input-process-output architecture.
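A minimal sketch of this input-process-output view follows; the domain (sensor readings turned into an alarm report) and all names are hypothetical, chosen only for illustration:

```python
# A system viewed as an information transform: input -> process -> output.
# The sensor/alarm domain is an invented example.

def read_inputs() -> list[float]:
    """Input: raw information entering the system (e.g. sensor readings)."""
    return [12.1, 19.8, 25.3]

def transform(readings: list[float], threshold: float = 20.0) -> list[str]:
    """Process: the transformation the system applies to its information."""
    return [f"ALARM: {r}" for r in readings if r > threshold]

def write_outputs(alarms: list[str]) -> None:
    """Output: information leaving the system."""
    for line in alarms:
        print(line)

write_outputs(transform(read_inputs()))  # prints "ALARM: 25.3"
```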
Supercomputers
A supercomputer is the most powerful and fastest type of computer in existence at a given time. Because these machines are designed to process huge amounts of information quickly, they are dedicated to specific tasks or applications rather than general use, such as:
1. Searching for oil fields using large seismic databases.
2. The study and prediction of tornadoes.
3. The study and prediction of weather anywhere in the world.
4. The development of models and designs for the creation of aircraft and flight simulators.
We should add that supercomputers are a relatively recent technology, so their use is not yet widespread and is sensitive to change. For this reason their price is very high, upwards of $30 million, and the number produced each year is small.
Concept
Supercomputers are the most powerful and fastest type of computer existing at a given time. They are physically large, the biggest among their peers. They can process huge amounts of information quickly and execute millions of instructions per second; they are intended for specific tasks and have very large storage capacities. They are also the most expensive, with costs that can exceed $30 million. Because of the high manufacturing cost very few are produced in a year, and some are built only to order.
They have special temperature control to dissipate the heat that some components can reach. The supercomputer acts as the arbiter of all applications and controls access to all files, as well as the input and output operations. The user turns to the organization's central computer when processing support is required.
They are designed as multiprocessing systems: the CPU is the processing center and can support thousands of online users. The number of processors in a supercomputer depends mainly on the model, ranging from about 16 up to 512 (as in the NEC SX-4 of 1997) and beyond. Machines such as the Cray-1, the Cyber, and various Fujitsu models belong to the supercomputer class.
Some History
- MANCHESTER MARK 1 (1948)
This first British supercomputer laid the foundations for many concepts still in use today.
In modern terms it had a RAM (random access memory) of just 32 positions or 'words'. Each word consisted of 32 bits (binary digits), which means the machine had a total of 1024 bits of memory.
The RAM technology was based on the cathode ray tube (CRT). CRTs were used to store bits of data as charged, phosphor-coated areas of the screen, which appeared as a series of glowing dots. The electron beam of the CRT could manipulate this charge efficiently, writing a 1 or a 0 and reading it back later as required.
By the mid-1950s Britain was behind the U.S. in the production of high-performance computers. In the fall of 1956 Tom Kilburn (co-designer of the Manchester Mark 1) initiated an effort known as the MUSE computer (from 'microsecond engine').
The design specifications included a target speed approaching one instruction per microsecond and the ability to attach a large number of peripherals of various types. They also required an immediate-access storage capacity larger than any then available.
Special techniques were used that eventually included what are now known as multiprogramming, scheduling, spooling, interrupts, pipelining, interleaved storage, autonomous transfer units, paging, and virtual storage, none of which were yet established at the time.
By 1959 the computer had been renamed Atlas, and it was subsequently developed as a joint effort between Tom Kilburn's team at Manchester University and the Ferranti company. Atlas was inaugurated on December 7, 1962, and was considered at the time to be the most powerful computer in the world: 80 times more powerful than the Meg/Mercury and 2400 times more powerful than the Mark 1.
The first IBM computer at Daresbury, an IBM 1800, arrived in June 1966 and acted as a control and data-transfer computer for the NINA synchrotron, then the main experimental facility. It was quickly followed by the first IBM mainframe at Daresbury, the IBM 360/50, which entered service in July 1966 and was replaced by an IBM 360/65 in November 1968.
During the early years the main task was to provide computing power to the Laboratory's High Energy Physics groups. Computing was very different in those days. The usual way to give the computer work was on punched cards (although some stalwarts still insisted on 5-hole paper tape). Typically one prepared a job on punched cards and placed the deck in a tray; an operator would then take the tray of cards away and return it with the line-printer output the job had produced.
This turnaround time was measured in tens of minutes at best. The mean time between failures toward the end of the 1960s was about a day. These computer failures, however, went largely 'unnoticed' by users waiting for their trays to reappear, registering only as a slight delay in turnaround. The NAS/7000 (an IBM 'clone') was installed in June 1981.
This gave a huge increase in power and accuracy compared to previous systems.
The Cray-1 was the first 'modern' supercomputer.
One of the reasons the Cray-1 was so successful is that it could perform more than one hundred million arithmetic operations per second (100 Mflop/s).
To match that speed with the conventional machines of the day, you would have needed to connect about 200 PCs, or you could simply have bought 33 Sun-4s.
CONVEX C-220 AND THE UNIX REVOLUTION
The arrival of UNIX qualitatively changed the way scientists approached computing problems. It offered a flexible way of providing computing power in a rapidly changing hardware market and, crucially, a way of keeping pace with the changing requirements of users' scientific applications. New components could be added, or power increased, simply and as needed.
Intel
The 64-node Intel iPSC/860 was called RX. Each node had a 40 MHz clock and 16 Mbytes of memory. Direct-connect hardware allowed node-to-node data transfer at 2.8 Mbytes/second. There were 12 Gbytes of locally attached disk, with Ethernet connections to a Sun-670MP workstation for user access.
A maximum single-node performance of 40 Mflops gives a total of more than 2.5 Gflops for the whole machine. The software provided to ease programming included Fortran and C compilers.
In 1995
A set of 26 'workstations' running under UNIX, with software capable of running jobs independently or working together with data transfer over a high-speed switch.
The IBM SP2
It consists of 24 P2SC (POWER2 Super Chip) nodes, plus 2 older wide nodes, housed in two racks. Each node has a 120 MHz clock and 128 Mbytes of memory. Two new High Performance Switches (TB3) connect the nodes together. Data storage consists of 40 Gbytes of fast locally attached disks, with Ethernet and FDDI networks for user access.
A maximum single-node performance of 480 Mflops gives a total of over 12 Gflops for the whole machine.
An RS/6000 PowerPC workstation is connected to the SP2 for monitoring and management of the hardware and software.
SOVIET SUPERCOMPUTERS
Just as there was a space race and an arms race, it should surprise no one that there was also a supercomputer race. The Soviet Union's high-performance computers were, of course, developed in secret. The information here is, sadly, probably still fairly vague.
- THE BESM FAMILY
A series of 'high-performance' numerical computers.
The BESM-6 was designed in 1965 by a group of engineers working at the S.A. Lebedev Institute of Precision Mechanics and Computer Engineering (ITMiVT) in Russia.
Production started in 1967 at the SAM plant (SAM stands for 'Calculating-Analytical Machines') in Moscow. The basic configuration included the CPU, 192 KB of core memory, magnetic drums, a proprietary magnetic tape drive, teletype machines, typewriters (with a parallel interface), alphanumeric printers, punched-card readers, and paper-tape readers/punches. About 350 units had been built by the early 1980s. The latest configurations included standard 1/2-inch tape drives, cloned IBM magnetic disk drives, serial VDUs, plotters, etc., mostly imported or cloned from the original hardware.
Today the design of supercomputers is based on four important technologies:
- Vector-register technology, created by Seymour Cray, the father of supercomputing, who invented and patented several technologies that led to the creation of ultra-fast machines. This technology allows many arithmetic operations to be executed in parallel (a minimal sketch follows this list).
- The system known as M.P.P., from the initials of Massively Parallel Processors, or massively parallel processing, which involves the use of hundreds and sometimes thousands of tightly coupled microprocessors.
- Distributed-computing technology: clusters of relatively inexpensive general-purpose computers interconnected by low-latency, high-bandwidth local networks.
- Quasi-supercomputer: recently, with the popularization of the Internet, distributed-computing projects have emerged in which special software exploits the idle time of thousands of personal computers to perform large tasks at low cost. Unlike the three previous categories, the software that runs on these platforms must be able to divide the work into separate calculation blocks that need not be reassembled or communicated for several hours at a time. Projects such as SETI@home stand out in this category.
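As a minimal sketch of the vector idea named in the first item of this list, the example below uses NumPy's whole-array operations to stand in for hardware vector registers; the sizes and timings are illustrative only:

```python
# Vector processing sketch: one operation applied to whole arrays at once,
# as hardware vector registers do. NumPy stands in for the vector unit.
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar style: one addition per loop iteration.
t0 = time.perf_counter()
c_scalar = np.empty(n)
for i in range(n):
    c_scalar[i] = a[i] + b[i]
scalar_time = time.perf_counter() - t0

# Vector style: a single array-wide addition.
t0 = time.perf_counter()
c_vector = a + b
vector_time = time.perf_counter() - t0

assert np.allclose(c_scalar, c_vector)
print(f"scalar: {scalar_time:.3f} s   vector: {vector_time:.3f} s")
```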
HIGHLIGHTS
- Processing speed: billions of floating-point operations per second.
- Simultaneous users: up to thousands, in a large network environment.
- Size: they require special facilities and industrial air conditioning.
- Ease of use: for specialists only.
- Usual clients: large research centers.
- Social penetration: practically nil.
- Social impact: almost nil in everyday terms, yet without supercomputers we could not do things like predict the weather well in advance or solve very complex calculations that cannot be worked by hand.
- Installed base: fewer than a thousand worldwide.
- Cost: up to tens of millions of dollars each.
Conclusion
Supercomputer: a computer with calculation capabilities far beyond those common for machines of the same period of manufacture.
They are very expensive, so their use is limited to military agencies, governments, and large businesses. They usually serve scientific applications, especially simulations of real-world phenomena.
Some well-known supercomputers are Blue Gene, Seymour Cray's Cray machines, Deep Blue, the Earth Simulator, MareNostrum, etc.
Supercomputers are usually designed according to one of the following four models:
* Vector registers.
* M.P.P. systems (Massively Parallel Processors).
* Distributed-computing technology.
* Quasi-supercomputer.
The most common uses for supercomputers include weather forecasting, complex 3D animation, fluid-dynamics calculations, nuclear research, oil exploration, etc.
Motion Capture
Most of you will have heard the term 'Motion Capture' (aka MoCap), but have you ever wondered what it is? It is the process of recording the motion of a real-life subject onto a digital model. A simpler definition would be: the process of transferring the actions performed by a real-life subject onto a computer-generated model.
How was this developed?
The idea developed from a method used by 2D animators, who would record the performance of an actor and then, analysing the video frame by frame, try to recreate the same motion in their animated drawings. There is not much difference between that method and motion capture; the only difference is that motion capture uses a computer to capture the performance in real-time 3D.
So what are the applications of motion capturing?
Motion capture covers the movements of an actor, facial expressions, camera movements, and so on. One of the major areas where it is used is animating 3D models: in games, for a realistic playing experience; in virtual-reality systems such as diving simulations; in the medical field, to train surgeons and to assist in surgery performed with precision robotic arms; and in biomechanical studies, to analyse human motion for applications in real-world mechanics.
How do they do it?
Well, there are quite a lot of ways to achieve motion capture; one of the most prominent is to use markers. Markers are small photo-reflective or light-emitting objects placed at key points, chosen so that maximum motion can be captured with a minimum of markers; usually they are placed on joints and other locations that need referencing. These markers are tracked by sensors, thereby determining the position of each marker in 3D space. A minimal sketch of this idea follows.
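The sketch below shows one captured frame of marker positions and derives a joint angle from three of them to drive a digital model; the marker names and coordinates are invented for the example:

```python
# Marker-based capture sketch: each frame maps marker names to the 3D
# positions reported by the sensors; a joint angle is derived from three
# markers. All names and numbers are hypothetical.
import math

frame = {
    "shoulder": (0.0, 1.4, 0.0),
    "elbow":    (0.3, 1.1, 0.0),
    "wrist":    (0.6, 1.3, 0.0),
}

def joint_angle(a, b, c):
    """Angle at marker b (in degrees) formed by the segments b-a and b-c."""
    ab = tuple(p - q for p, q in zip(a, b))
    cb = tuple(p - q for p, q in zip(c, b))
    dot = sum(p * q for p, q in zip(ab, cb))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

elbow = joint_angle(frame["shoulder"], frame["elbow"], frame["wrist"])
print(f"elbow angle: {elbow:.1f} degrees")  # would drive the model's elbow
```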
The pros and cons of motion capture:
The main advantage of this method is that it saves time, since it is far easier to animate a 3D model once motion capture is employed. For some applications, real-time results can be obtained (the movements of a person trigger actions in the computer model as they happen). Long and dangerous shots can be animated and created easily using motion-capture technology.
Though motion capture offers a good set of advantages, there are some drawbacks: the cost of setting it up is high, the equipment is expensive, and handling motion-capture data is tedious if the process was faulty or if the proportions of the 3D model and the real-life actor do not match.
Wireless Technology
Wireless technology lets us connect without wires: it is any technology in which wires have been eliminated from areas where they were used before. Wireless applications appear chiefly in the field of communication, for example radio, TV broadcasting, mobile phones, Bluetooth, and wireless LAN.
Let us discuss some wireless technologies that have come to make our work easy and comfortable.
(1) AM RADIO: Amplitude modulation (AM) radio is used for long-range transmission. In villages AM is used so that information can be carried over long distances; it is widely used in small towns too. EXAMPLE: Akashvani.
(2) FM RADIO: Frequency modulation (FM) radio is used for short-range transmission, generally in cities. FM provides better sound quality (a sketch contrasting AM and FM follows this list). EXAMPLES: Radio City, My FM.
(3) SHORT-WAVE RADIO: Used for communications, handheld radios, and walkie-talkies.
(4) TV BROADCASTING: Modulated signals carrying live audio and video to a television audience. EXAMPLE: Tata Sky.
(5) GSM: The Global System for Mobile communications is a cellular telephony technology. It operates at 900, 1800, and 1900 MHz. EXAMPLES: Airtel, Idea.
(6) CDMA: A cellular telephony technology based on code division. EXAMPLES: Tata Indicom and Reliance.
(7) INFRARED: TV remote controls use this as a means of communication; the technology also appears in mobile phones.
(8) CORDLESS TELEPHONES: Use radio waves to replace the wire connecting the handset and the base unit.
(9) GPS: A satellite-based positioning service that uses radio waves.
(10) BLUETOOTH: A short-range wireless technology that allows several types of electronic devices to interconnect; it is widely used in mobile phones.
(11) WI-FI: A radio-based technology that provides portable wireless Internet access.
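As a minimal sketch of the difference between the AM and FM items above: AM varies the carrier's amplitude with the message signal, while FM varies its instantaneous frequency. The formulas are the standard textbook ones; all frequencies and indices are arbitrary example values.

```python
# AM vs FM modulation sketch; every parameter is an example value.
import numpy as np

fs = 48_000                      # sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)   # 10 ms of signal
fc, fmsg = 10_000, 440           # carrier and message frequencies (Hz)
m = np.sin(2 * np.pi * fmsg * t) # the message signal

# AM: s(t) = (1 + k*m(t)) * cos(2*pi*fc*t), k = modulation index
am = (1 + 0.5 * m) * np.cos(2 * np.pi * fc * t)

# FM: s(t) = cos(2*pi*fc*t + 2*pi*df * integral of m(t) dt),
# df = peak frequency deviation; cumsum/fs approximates the integral.
fm = np.cos(2 * np.pi * fc * t + 2 * np.pi * 1_000 * np.cumsum(m) / fs)

print(am[:3])  # amplitude-varying carrier samples
print(fm[:3])  # constant-amplitude, frequency-varying carrier samples
```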
CRM Software
CRM software finds its place in many corporate operations, including marketing, sales, and, most prominently, customer service. Each of these processes demands a significant level of sophistication and organization from management and administration, since they involve very sensitive data and resources that are vital to an organization. In such situations CRM provides a clear, automated workflow, accomplished through coordination among the relevant resources and processes.
Used as a marketing-management application, CRM offers a high degree of automation and ease of access, providing marketing professionals with segmentation and data-cleansing tools, campaign management, and other management features. All of these boost the organization's marketing and analytic capabilities, and with them its overall productivity. The most specific marketing-management features of CRM include lead and response management, campaign management, marketing analytics, data and list management, and planning and budgeting, each with an important role in the marketing arm of an organization.
The official CRM documentation presents the software as a sales and customer-service automation application as well as a marketing-management application, one that helps to manage and structure the relevant data so as to achieve faster response times. CRM also helps keep track of an organization's vital financial and sales data, turning the demanding task of sales management into an organized, simple process. As a sales-management application, CRM improves sales productivity by enhancing the most vital operations: account management, opportunity management, lead management, sales-team and territory planning, and forecasting and sales analytics. When these processes are organized and carried out with CRM software, organizations improve the sales lifecycle and raise close rates.
CRM is a tool that makes easy interaction with customers more than just a possibility. Customer communication, interaction, problem resolution, and every aspect of pre- and post-sales can be handled with minimum effort and cost on the organization's part. A centralized record of customer preferences and activity history, combined with relationship management, helps identify the specific interests of a prospective customer. Automated, consistent follow-up activities improve sales qualification, and a clear tab on customer responses permits accurate analysis of the organization's sales and financial activity. A minimal sketch of such a record follows.
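The sketch below models the centralized customer record with a consistent follow-up rule, as described above; the fields, dates, and rule are hypothetical and not taken from any particular CRM product:

```python
# Centralized customer record with an automated follow-up rule.
# All fields and rules are hypothetical examples.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class CustomerRecord:
    name: str
    interests: list[str] = field(default_factory=list)
    activity_history: list[tuple[date, str]] = field(default_factory=list)

    def log_activity(self, what: str, when: date) -> None:
        """Append an interaction to the centralized activity history."""
        self.activity_history.append((when, what))

    def next_follow_up(self, gap_days: int = 7) -> date:
        """Consistent follow-up: a fixed gap after the latest activity."""
        last = max(when for when, _ in self.activity_history)
        return last + timedelta(days=gap_days)

lead = CustomerRecord("Acme Corp", interests=["support contract"])
lead.log_activity("demo call", date(2010, 3, 1))
print(lead.next_follow_up())  # 2010-03-08
```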
Hence CRM software is undoubtedly a great tool for clear, easy, all-round management of customer relationships across sales, marketing, and both pre- and post-sales environments. As a highly productive tool from Microsoft, CRM once again sets an industry standard alongside a wide variety of server and corporate applications.