We live in an electrical and electronic world in which we can store data in memory. Most people know the two main types of memory, main memory and secondary memory. Apart from these, there is a very important type of memory known as cache memory. Pronounced "cash", a cache is a special high-speed storage mechanism. It can be either a reserved section of main memory or an independent high-speed storage device. Whenever some data is required, the CPU first looks in the cache to see whether it is there. If the data is found in the cache, the CPU does not access main memory, and the process therefore becomes very fast.
A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
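The lookup order described above can be sketched in a few lines of Python. This is a toy model, not a hardware description: the dictionary stands in for fast SRAM, the backing store for slow DRAM, and the class and method names are illustrative.

```python
# A minimal sketch of the memory-caching idea: a small fast store
# (the "SRAM" dict) is consulted before the slow backing store
# (the "DRAM" dict). All names here are illustrative.

class CachedMemory:
    def __init__(self, backing_store):
        self.backing_store = backing_store  # slow "DRAM"
        self.cache = {}                     # fast "SRAM"

    def read(self, address):
        # The CPU first looks in the cache ...
        if address in self.cache:
            return self.cache[address]      # cache hit: no DRAM access
        # ... and only on a miss goes to main memory.
        value = self.backing_store[address]
        self.cache[address] = value         # keep it for next time
        return value

dram = {0x10: "data-a", 0x20: "data-b"}
mem = CachedMemory(dram)
mem.read(0x10)  # first access: fetched from the backing store
mem.read(0x10)  # second access: served from the cache
```

After the first read, the value lives in the fast store, so repeated accesses never touch the slow one, which is exactly why caching pays off for programs that reuse the same data.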
Examples
Some memory cache is built into the architecture of microprocessors. The Intel 80486 microprocessor, for example, contains an 8 KB memory cache, and the Pentium has a 16 KB cache. Such internal caches are often called level 1 (L1) caches. Most modern PCs also come with level 2 (L2) caches. These caches sit between the CPU and the DRAM. Like L1 caches, L2 caches are composed of SRAM, but they are much larger.
Principle
Disk caching works on the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk, as well as adjacent sectors, is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see whether the data is there. Disk caching can drastically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
When data is found in the cache, it is called a cache hit, and the effectiveness of a cache is judged by its hit rate. Many cache systems use a technique known as smart caching, in which the system can recognize certain types of frequently used data. The strategies for determining which information should be kept in the cache constitute some of the more interesting problems in computer science.
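The hit rate mentioned above is simply hits divided by total accesses. The following sketch simulates a tiny cache with a first-in, first-out eviction policy (a deliberately simple stand-in for real replacement strategies) and measures the hit rate over a stream of block accesses; the function name and policy are illustrative assumptions, not taken from any particular system.

```python
# Toy simulation of cache hit rate: count hits and misses over a
# sequence of block accesses, evicting the oldest block (FIFO) when
# the cache is full. Real systems use smarter replacement policies.

def simulate_cache(accesses, capacity=3):
    cache, hits, misses = [], 0, 0
    for block in accesses:
        if block in cache:
            hits += 1
        else:
            misses += 1
            cache.append(block)
            if len(cache) > capacity:
                cache.pop(0)  # evict the oldest cached block
    return hits / (hits + misses)

# Repeated accesses to the same few blocks yield a high hit rate.
rate = simulate_cache(["a", "b", "a", "a", "b", "c", "a"])
# 4 hits out of 7 accesses: rate == 4/7
```

Because most programs reuse the same data, even this tiny cache turns a majority of accesses into hits, which is the whole economic argument for caching.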
Types of cache memory
There are two types of cache memory – level 1 (L1) cache and level 2 (L2) cache.
The L1 cache is built inside the CPU package, while the L2 cache is external to the CPU and resides on the motherboard. Typical L2 cache sizes are 64/128 KB in 386DX systems, 128/256 KB in 486 systems, and 256/512 KB in Pentium systems. In sixth-generation processors, from the Pentium Pro onwards, both the L1 and L2 caches have been integrated into the CPU package to reduce access time and improve system performance further. The L2 cache on the motherboard works at the motherboard bus speed, while the L1 cache inside the CPU works at the CPU clock speed. For example, in a Pentium II 450 MHz system the L2 cache works at 225 MHz. Moreover, in the Pentium III we find 256 KB of on-die L2 cache that works at the same clock speed as the CPU, which means the minimum possible wait time and maximum system performance.
The internal L1 cache has been a feature from the 486 onwards. 486 processors have a combined 8 KB L1 instruction and data cache, while classic Pentium processors have 16 KB of L1 cache, organized as separate instruction (Icache) and data (Dcache) caches. Since the L1 cache is on-die, it operates at the same clock speed as the CPU.
Telecommunication system
A telecommunication system consists of hardware and software that transmits information from one location to another. These systems can transmit text, data, graphics, voice, documents, or full-motion video information. The major components of a telecommunication system include the following:
Hardware – all types of computers (e.g., desktop, server, mainframe) and communication processors.
Communication media – the physical media through which electronic signals are transmitted, including wireless media.
Communication networks – the links among computers and communication devices.
Communication software – software that controls the telecommunication system and the entire transmission process.
Data communication providers – regulated utilities or private firms that provide data communication services.
Communication protocols – the rules for transmitting information across the system.
Communication applications – electronic data interchange (EDI), teleconferencing, videoconferencing, electronic mail, facsimile, electronic funds transfer, and others.
The system must do all of the following: transmit information; establish the interface between the sender and the receiver; route messages along the most efficient paths; ensure that the right message gets to the right receiver; check the message for errors and rearrange the format if necessary; convert messages from one speed to another (computers are usually faster than communication media); ensure that the sending devices, receiving devices, and communication links are operational (in other words, maintain the network); and secure the information at all times.
Signals:
Telecommunications media carry two basic types of signals, analog and digital. Analog signals are continuous waves that transmit information by altering the characteristics of the waves. Analog signals have two parameters, amplitude and frequency.
Digital signals do not have the characteristic wave shape that analog signals do. Rather, they are discrete pulses that are either on or off. This quality allows them to convey information in a binary form that can be clearly interpreted by computers.
Communication processors:
Communications processors are hardware devices that support data transmission and reception across a telecommunication system. These devices include modems, multiplexers, front-end processors, and concentrators.
Modem:
The conversion from digital to analog is called modulation, and the reverse is called demodulation. The device that performs these two processes is called a modem, a contraction of the terms modulator/demodulator. A modem's transmission speed is measured in bits per second (bps). Typical modem speeds range from 14,400 to 56,000 bps.
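Because speed is quoted in bits per second while file sizes are quoted in bytes, a quick conversion shows what these figures mean in practice. The sketch below uses assumed values and ignores framing and compression overhead that real modems add.

```python
# Worked example (illustrative values): how long a transfer takes at
# the modem speeds quoted above. One byte is sent as 8 bits; real
# modem protocols add framing overhead, which this sketch ignores.

def transfer_seconds(size_bytes, bits_per_second):
    return size_bytes * 8 / bits_per_second

one_megabyte = 1_000_000
slow = transfer_seconds(one_megabyte, 14_400)  # roughly 556 seconds
fast = transfer_seconds(one_megabyte, 56_000)  # roughly 143 seconds
```

The ratio of the two speeds carries straight through to the transfer times, which is why the jump from 14,400 to 56,000 bps felt dramatic to dial-up users.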
Multiplexer:
A multiplexer is an electronic device that allows a single communications channel to carry data transmissions from many sources simultaneously. Multiplexers lower communication costs by allowing devices to share communication channels.
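One common way to share a channel is time-division multiplexing, in which each source gets a fixed slot in every round. The sketch below models that idea with plain lists; the function names are illustrative, and real multiplexers work on electrical signals rather than Python objects.

```python
# Sketch of time-division multiplexing: interleave one item from each
# source per round onto a shared channel, then recover the streams by
# taking every n-th item. Names are illustrative.

def multiplex(sources):
    """Interleave one item from each source per round."""
    channel = []
    for round_items in zip(*sources):
        channel.extend(round_items)
    return channel

def demultiplex(channel, n_sources):
    """Recover each source's stream from the shared channel."""
    return [channel[i::n_sources] for i in range(n_sources)]

a = ["a1", "a2"]
b = ["b1", "b2"]
shared = multiplex([a, b])          # ["a1", "b1", "a2", "b2"]
recovered = demultiplex(shared, 2)  # [["a1", "a2"], ["b1", "b2"]]
```

Because the slot positions are fixed, the receiver needs no addressing information to separate the streams, which keeps the shared channel cheap; that is the cost saving the paragraph above describes.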
Front-end processors:
In most computers, the central processing unit (CPU) must communicate with several devices at the same time. Routine communication tasks can absorb a large proportion of the CPU's processing time. In order not to waste valuable CPU time, many computer systems have a small secondary computer dedicated solely to communication, known as a front-end processor. This specialized computer manages all routine communication with peripheral devices. The functions of a front-end processor include coding and decoding data; detecting errors; and recovering, recording, interpreting, and processing the control information that is transmitted.
Communication media and channels:
For data to be communicated from one location to another, some form of pathway or medium must be used. These pathways are called communication channels, and they include twisted-pair wire, coaxial cable, fiber-optic cable, microwave transmission, and satellite transmission.
Cable media:
Cable media use physical wires or cables to transmit data and information. There are three types of cable media: twisted-pair wire, coaxial cable, and fiber-optic cable. Twisted-pair wire and coaxial cable are made of copper, while fiber-optic cable is made of glass.
Twisted pair wire:
Twisted-pair wire is the oldest and still the most prevalent form of communication wiring. It consists of two insulated copper wires, typically about 1 mm thick, twisted together in a helical form, just like a DNA molecule. The purpose of twisting the wires is to reduce electrical interference from similar pairs close by. The most common application of twisted pair is the telephone system; a twisted pair connects nearly all telephones to the telephone company office. Twisted pair can run several kilometers without amplification, but for longer distances repeaters are needed. It is relatively inexpensive to purchase, widely available, and easy to work with, and it can be made relatively unobtrusive by running it inside walls, floors, and ceilings. However, twisted-pair wire has some significant disadvantages. It emits electromagnetic interference, is relatively slow for transmitting data, is subject to interference from electrical sources, and can easily be "tapped" by unintended receivers seeking unauthorized access to data.
Coaxial cable:
A coaxial cable consists of a stiff copper wire as the core, surrounded by an insulating material. A cylindrical conductor, often a closely woven braided mesh, encases the insulator. The outer conductor is covered in a protective plastic sheath. Coaxial cable has better shielding than twisted pair, and can carry much more data over longer distances at higher speeds. However, it can cost 10 to 20 times more than twisted-pair wire. Also, because of its inflexibility, it can increase the cost of installation or re-cabling when it must be moved.
Data transmission over coaxial cable is divided into two basic types:
Baseband transmission – transmission is digital, and each wire carries only one signal at a time.
Broadband transmission – transmission is analog, and each wire can carry multiple signals simultaneously.
Fiber optic cable
Fiber-optic cable consists of thousands of very thin filaments of glass fiber that conduct light pulses generated by lasers at very high transmission frequencies. Cladding, a coating that prevents light from leaking out of the fiber, surrounds each fiber. An optical transmission system thus has three components: the light source, the transmission medium, and the detector. Conventionally, a pulse of light indicates a 1 bit and the absence of light indicates a 0 bit. The ultra-thin glass fiber is the transmission medium. The detector generates an electrical pulse when light falls on it. By attaching a light source to one end of the optical fiber and a detector to the other, we have a unidirectional data transmission system that accepts an electrical signal, converts and transmits it as light pulses, and then reconverts the output into an electrical signal at the receiving end. A single glass fiber can carry more than 30,000 simultaneous telephone calls, compared to about 5,500 calls on a standard copper coaxial cable. Optical fiber has proven performance at data transmission rates of 2.5 gigabits per second.
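The pulse-of-light convention above amounts to a simple bit-level encoding, which can be sketched in code. The model below represents the electrical signal as bytes and the optical link as a list of on/off pulses; the function names are illustrative, and real systems layer framing and error correction on top of this.

```python
# The on/off light-pulse convention as code: a 1 bit is a pulse of
# light (True), a 0 bit is darkness (False). Names are illustrative.

def to_pulses(data: bytes):
    """Transmitter: convert each bit, most significant first, to a pulse."""
    return [bool((byte >> i) & 1) for byte in data for i in range(7, -1, -1)]

def from_pulses(pulses):
    """Detector: regroup the received pulses into bytes."""
    out = bytearray()
    for i in range(0, len(pulses), 8):
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | int(bit)
        out.append(byte)
    return bytes(out)

link = to_pulses(b"Hi")          # 16 pulses for 2 bytes
assert from_pulses(link) == b"Hi"  # round trip recovers the signal
```

The round trip mirrors the unidirectional system described above: electrical in, light on the fiber, electrical out at the detector.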
Optical networking
Optical networking increases fiber-optic capacity by adding 16, 32, or up to 80 additional wavelengths to a single fiber-optic filament. The key to an optical network is that signals coming into the network are converted immediately to colors of light and are not converted to electrical signals. At this time, there are no defined limits to how much bandwidth a single fiber can hold.
Wireless media
The key to mobile communications in today's rapidly moving society is data transmission over electromagnetic media, the "airwaves". Common wireless data transmission media and devices include microwave transmission, communication satellites, pagers, cellular phones, mobile data networks, personal communication services, and personal digital assistants.
Microwave:
Microwave systems are widely used for high-volume, long-distance, point-to-point communication. Microwave towers usually cannot be spaced more than 30 miles apart, because the earth's curvature would interrupt the line of sight from tower to tower. To minimize line-of-sight problems, microwave antennas are usually placed on top of buildings, towers, and mountain peaks. Long-distance telephone carriers use microwave systems because they generally provide about ten times the data-carrying capacity of wire. Microwave transmissions are susceptible to environmental interference during severe weather such as heavy rain or snowstorms. Microwave is also relatively inexpensive.
Satellite
A major advance in communication in recent years is the use of communication satellites for digital transmission. As with microwave transmission, satellites must receive and transmit via line of sight. However, the enormous footprint of a satellite's coverage area from high altitude overcomes the limitations of microwave data relay stations. Currently, there are three types of orbits in which satellites are placed: geostationary earth orbit, medium earth orbit, and low earth orbit.
A global positioning system (GPS) is a wireless system that uses satellites to enable users to determine their position anywhere on earth. GPS equipment has been used for navigation by commercial airlines and ships. GPS is supported by 24 satellites that are shared worldwide. Each satellite orbits the earth once every 12 hours, on a precise path at an altitude of 10,900 miles.
Radio
Radio electromagnetic data communications do not have to depend on microwave or satellite links, especially for short ranges such as within an office setting. Radio is being used increasingly to connect computers and peripheral equipment, or computers and local area networks. For data communications, the greatest advantage of radio is that no metallic wires are needed. Radio waves tend to propagate easily through normal office walls. Radio waves are also omnidirectional, meaning that they travel in all directions from the source, so the transmitter and receiver do not have to be carefully aligned physically. The devices are fairly inexpensive and easy to install. Radio also allows for high data transmission speeds.
Infrared
Infrared light is light beyond the red end of the spectrum that is not visible to human eyes. It can be modulated or pulsed to convey information. The most common applications of infrared light are in television and video cassette recorder remote control units. Infrared is also used with computers and local area networks. Many portable PCs come with infrared ports, which are handy when a cable connection with a peripheral (such as a printer or modem) is not practical.
Disadvantages
Signals are not secure; they are available to all receivers within the footprint, intended or not. Some frequencies are susceptible to interference from bad weather or ground-based microwave signals.
Accessibility standards for disabled and aging populations upset many habits in organizations that set out to establish their Web development practices. The requirements of these standards call into question practices that until now were often considered adequate, proven, and best. The desire to integrate these new quality concerns into a production team leads to sudden changes that may destabilize the profitability of a production website.
Although the accessibility principles applied in practice are not difficult to implement, the risk of falling into certain traps is great. This article aims to identify the traps into which organizations are most likely to fall. The list is not exhaustive; for the sake of this article, we have retained only those most frequently identified in our practice.
Intimately linked to one another, these traps often cause a cascade of negative consequences for the potential accessibility of Web projects. Here is the list:
Consider accessibility as a final step and ignore its cross-cutting nature
Assign responsibility for accessibility to Web integrators only
Limit the accessibility validation to the requirements outlined in the standards
Confuse the concepts of standards compliance and actual accessibility
Assume that the validation tools will do the job
Underestimate the impact of technology platforms
For each trap, we will attempt to analyze the nature and cause of the problem. We then present some tips for quickly integrating prevention mechanisms that will help to avoid them.
Some common pitfalls
Consider accessibility as a final step and ignore its cross-cutting nature
The first trap into which many managers and project managers fall during the planning stages of a Web project is to perceive accessibility as an additional link added at the end of the Web production chain.
In this mistaken perspective, accessibility is seen as an additional validation grid to apply during the quality-control phase of the project, just before the launch of the result. The checklist concept that comes with accessibility standards is certainly no stranger to this perception.
In the case of a formal end-of-project accessibility review, accessibility experts too often observe that certain earlier decisions have undermined the potential accessibility of the project. For example, consider the following situations:
navigation systems on a website that are entirely dependent on the mouse, and therefore unusable for anyone who navigates with the keyboard;
technology choices that are inherently incompatible with adaptive technologies, or operated in a manner inconsistent with them;
content made available online in downloadable formats rather than HTML, and therefore impossible for screen readers to interpret;
text alternatives for non-text content that were not foreseen by the developers, and to which neither the editor nor the organization was prepared to devote the necessary effort;
graphic layouts with insufficient color contrast, adopted while the visual signature is already plastered across the city and the Web.
Consequently, the organization often finds itself facing the conclusion that it is too late to get it right, either because budgets or schedules leave no room to reverse course, or because the technology choices limit the team's ability to produce accessible content. However, the mere fact of asking the right questions at the right time helps to avoid failures in the vast majority of cases.
Unlike "traditional" Web dimensions such as design, analysis, ergonomics, usability, writing, and even integration and programming, accessibility is not a separate, additional step. It spreads into every link of the Web production chain, requiring an intervention at the appropriate time. This is mainstreaming.
Some choices made at the beginning and middle of the project facilitate the implementation of accessibility or, conversely, create barriers. Mainstreaming accessibility avoids significant omissions, particularly in technical and strategic choices. Such oversights can result in costly delays and additional effort to correct the situation, when patches are even possible. As the saying goes, prevention is better than cure.
Any manager or project manager must recognize this cross-cutting character of accessibility. He or she will then understand the need to determine which links in the Web production chain must be assigned the different responsibilities and tasks associated with the requirements outlined in the standards.
Assign responsibility for accessibility to Web integrators only
Currently, the responsibility for implementing accessibility requirements still falls too often on the Web integrator alone. It is illusory to believe that it can rest on the shoulders of a single accessibility champion within an organization.
While the vast majority of accessibility requirements call for the intervention of Web integrators, several decisions and policies must be made in the early stages of the project to ensure efficient work. For an accessibility approach to prove successful, it is important that each member of the team understand the accessibility features within his or her responsibility. This will avoid having to go back to:
the editors, to ask for text equivalents for images;
the designers, to review their graphics to adopt sufficient color contrast;
the programmers, to provide an explicit association between labels and their corresponding fields in a form;
or the project authority, to challenge technological choices that limit or prevent the fulfillment of certain requirements.
Whether we consider analysts, ergonomists, designers, editors, or information architects, everyone is called upon to make decisions or take actions that may affect the potential accessibility of the project. To prevent these interventions from having a negative impact on the final outcome, it is important that accessibility become a shared responsibility.
Limit the accessibility validation to those requirements outlined in the standards
The development of an accessible website is very often perceived as an audit performed against a small list of requirements, as if it were enough to verify compliance with each requirement to produce an accessible website. In fact, it is not that simple.
Implementing all the requirements outlined in a standard achieves compliance with that standard. However, the systematic application of these requirements alone is not a sufficient guarantee of accessibility: functional testing with computer adaptive technologies should also be carried out to validate that the user experience is positive. Indeed, it is quite possible to produce a website that meets all accessibility requirements but is still inaccessible in some respects. Missing a few small subtleties is enough to cause a significant degree of confusion for a user who cannot perceive the visual interface.
Functional testing means using computer adaptive technologies, such as voice synthesizers, screen readers, and magnification software, to verify that the result interpreted by these tools corresponds to the result observed in a visual validation of the Web pages. For example, an absence of punctuation in the texts that replace images, or a visual punctuation based solely on layout, or the reading order of content presented in an HTML table, can produce a result very different from what was imagined. Similarly, it is often by re-reading a Web page with a screen reader that one notices that some information transmitted visually is missing from the speech or Braille output. Speech playback may also cause confusion in pronunciation or in the sequencing of content (e.g., "standards and norms" read as "enormous standard").
By adding functional tests to technical tests, producers of Web content make sure that nothing escapes the quality controls related to accessibility.
Confuse the concepts of standards compliance and actual accessibility
For most project managers and Web developers, the concepts of compliance with accessibility standards and making Web content accessible are interchangeable. However, these two concepts have very different meanings, and one cannot serve as a guarantee of the other.
The notion of compliance with accessibility standards is based on the effective, satisfactory, and consistent application of all the requirements listed in a standard. In the field of standardization, conformity to a standard means that the application meets all of its requirements. The result is dichotomous: the application is compliant or it is not. In standardization, a statement such as "this website is 60% compliant with the standard" makes no sense. The concept is objective because it is limited to the application of measurable requirements.
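The dichotomy can be stated in code terms. The sketch below is an illustrative model, not part of any standard: conformity is the logical AND of all requirement results, so a single failure makes the site non-compliant, and a percentage of passed checks, while useful for tracking progress, is not a conformance claim.

```python
# Illustrative model of the conformity dichotomy: compliance is
# all-or-nothing, whatever the pass ratio. Requirement names are
# made up for the example.

requirement_results = {
    "images-have-alt": True,
    "labels-bound-to-fields": True,
    "sufficient-contrast": False,
}

compliant = all(requirement_results.values())
# False: one failed requirement is enough to break conformity.

passed_ratio = sum(requirement_results.values()) / len(requirement_results)
# 2/3 of checks pass; a useful progress figure, but not conformity.
```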
The concept of accessibility, for its part, is based on an effort to bring down as many barriers as possible to the use of Web content by people with disabilities. A subjective concept, it requires the implementation of several measures to adapt content to specific user needs. Contrary to the notion of conformity to an accessibility standard, it involves functional testing to ensure that, beyond standards compliance, the efforts have real effects on the user experience of the disabled people who visit the content.
The results of implementing accessibility cannot be measured objectively, however, because it is impossible to judge at what point a website becomes "sufficiently accessible". Thus it is up to the Web developer, according to the intentions of the organization or the budget available, to determine the threshold at which the result is considered satisfactory.
It is therefore possible to produce a standards-compliant website that still does not quite meet the needs of users, because some details have escaped the developers. In the same spirit, a website considered "reasonably accessible" in the eyes of its developers may not fully meet the needs of a particular clientele that was ignored during the process. Hence the importance of combining the two concepts to reach the widest possible audience.
Assume that the validation tools will do the job
Accessibility cannot be measured merely by the presence or absence of certain key elements that an assessment tool can automatically detect. Just as with ergonomics, accessibility relies on the good judgment of developers, and therefore on competent human evaluation.
In an ideal world, quality control would fall to machines. It would be pointless to address the adaptation needs of disabled people on the Web, because the tools would be able to take care of this entire concern. Depending on the legislation in place in their countries of origin, the tools are often accompanied by promotional claims that inflate their abilities regarding potential accessibility. Several managers and project managers tend to believe that these tools can do all the work for them, but in fact they cannot. Whoever places responsibility for accessibility goals in the hands of the tools or platforms he uses is doomed to great disappointment the day he undertakes assessment tests to measure the project's actual level of accessibility.
Of all the accessibility requirements contained in the standards, only a minority can be tested in a way so fully automated that human intervention becomes unnecessary. It is generally estimated that around 30% of the requirements can be verified completely automatically. For example, while it is easy to programmatically check for the presence of an alt attribute holding the replacement text of an image, it is impossible to check automatically whether the value of that text equivalent actually conveys all the text appearing in the picture.
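The automatable half of that example can be sketched with nothing more than Python's standard `html.parser` module. The checker below flags `img` tags that lack an `alt` attribute entirely; judging whether a present `alt` text is adequate remains, as noted above, a human task. The class name is illustrative.

```python
# Minimal automated check: count <img> tags with no alt attribute,
# using only the standard library. Detecting a *missing* alt is easy
# to automate; judging the *quality* of an alt text is not.

from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

checker = MissingAltChecker()
checker.feed('<p><img src="logo.png" alt="Company logo">'
             '<img src="decor.png"></p>')
print(checker.missing)  # prints 1: one image has no alt attribute
```

A tool like this catches the mechanical omissions quickly, which is precisely the roughly 30% of checks that automation covers well; everything else still needs a competent human review.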
Underestimate the impact of technology platforms
In addition to distributing the interventions required by each requirement across the Web production chain, it is crucial that the strategists on the Web production team (usually those responsible for decisions, such as technical managers, project managers, or analysts) question the potential of the tools and technology choices under consideration. Indeed, even the very best Web team, with the most stringent accessibility processes, may not achieve its compliance goals if the chosen tools make the task impossible.
In addition, several managers and project managers mistakenly assume that because they rely on high-end technology platforms acquired at high prices, accessibility issues will necessarily be handled automatically and successfully.
In practice, many organizations struggle with technology platforms of all kinds, where even the most basic accessibility requirements collide with the technical limitations of the existing tools. For example, seemingly simple changes that would require only easy corrections to the HTML or XHTML are impossible to perform in practice, because Web developers do not have access to the source code generated by their tools. In this context, it is not possible to expect to achieve standards-compliance goals.
While open-source technology platforms generally offer greater potential accessibility, thanks to the possibility of modifying the code generated by the tools themselves, we should not conclude that proprietary tools cannot ensure compliance with accessibility requirements. Some tools are more flexible than others. The important thing is to be aware of this reality and to ask the hard questions when it comes time to choose one platform over another.
A planned approach to doing business has, contrary to existing fears, proved to be excellent in many economically developed countries. The majority of enterprises in the U.S. and Western Europe work on exactly such a scheme. Key performance indicators (KPIs) are central to assessing every employee: they make it possible to understand the contribution of each team member to the common cause and, on the basis of this calculation, to determine the size of each salary. Key performance indicators make it possible to motivate employees to work more intensively, thereby benefiting the common cause. In which cases does it make sense to be guided by such an evaluation system for determining each employee's contribution to the company's profits? The indicators should be realistic, transparent, and understandable. When a person knows what he should do, he is able to create more value, thereby contributing to the prosperity of the commercial organization. Typically, the KPI system itself is uniform across companies; only the methods of implementation differ. An objective evaluation system is based on common principles:
• KPIs are based on objective indicators and, most importantly, reliable data;
• staff performance is measured according to a common scheme;
• motivating successful employees and enhancing the effectiveness of their work is simple and straightforward.
In cases where management has full information and is able to adequately evaluate the work of team members, work under the KPI system is completely justified. But in reality it often happens that, as the saying goes, "the right hand does not know what the left is doing". The head office may set subordinates plans that are impossible to fulfill, or tasks that do not fall within the competence of the performers, and so on. All this hinders the development of the company and does not allow it to grow. Team members are then no longer interested in effectiveness, which adversely affects the budget of the commercial structure. On the other hand, it is important that the KPI system not inhibit the activity of the team, but rather encourage employees to work more effectively. It is crucial to take a weighted, careful approach to determining the duties of each employee, outlining them in the terms of his official duties.
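A weighted KPI score of the kind discussed above can be illustrated with a short calculation. The weights, targets, and figures below are invented for the example, and the simple "cap each indicator at 100% of target" rule is one possible policy, not a prescription from the article.

```python
# Illustrative weighted-KPI calculation (all numbers are assumed):
# each indicator has a weight and an achievement ratio, and the
# combined score can scale a bonus component of salary.

def kpi_score(indicators):
    """indicators: list of (weight, target, actual); weights sum to 1."""
    return sum(weight * min(actual / target, 1.0)
               for weight, target, actual in indicators)

sales_rep = [
    (0.5, 100_000, 90_000),  # revenue: 90% of target, weight 50%
    (0.3, 40, 40),           # new clients: target met, weight 30%
    (0.2, 95, 76),           # service quality: 80% of target, weight 20%
]
score = kpi_score(sales_rep)  # 0.5*0.9 + 0.3*1.0 + 0.2*0.8 = 0.91
bonus = 20_000 * score        # bonus pool scaled by the KPI score
```

Because each weight, target, and formula is written down, the employee can see exactly how the score was reached, which is the transparency the principles above call for.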
Web Hosting
Web hosting is a service offered by web hosting providers to individuals and organizations to make their websites accessible on the internet. Depending on the requirements, one can choose different types of web hosting, such as shared hosting, dedicated hosting, virtual private hosting, and cloud hosting.