RE: 5 pen pc technology documentation
Five Pen PC, known in short as P-ISM ("Pen-style Personal Networking Gadget Package"), is a new concept under development by NEC Corporation. P-ISM is a gadget package providing five functions: a CPU pen, a communication pen with a cellular phone function, a virtual keyboard, a very small projector, and a camera. The P-ISMs are connected to one another through short-range wireless technology, and the whole set is connected to the Internet through the cellular phone function. This personal gadget in a minimalist pen style enables truly ubiquitous computing.
The conceptual prototype of the "pen" computer was built in 2003. The prototype device, dubbed the P-ISM ("Pen-style Personal Networking Gadget"), was created by the Japanese technology company NEC and was featured at the 2003 ITU Telecom World held in Geneva, Switzerland. The designer of the five-pen technology, Toru Ichihash, said that in developing this concept he asked himself, "What is the future of IT when it is small?" The pen was a logical choice. He also wanted a product that you could touch and feel, and the intent was to allow for an office anywhere. However, although a conceptual prototype was built in 2003, such devices are not yet available to consumers. An article about the device published on the Wave Report website in 2004 explains: "At ITU Telecom World we got a sample of another view by NEC. It is based on the pen and called P-ISM. This concept is so radical that we went to Tokyo to learn more."
The functionality of the CPU is provided by one of the pens, also known as the computing engine. It has a dual-core processor embedded in it and runs the Windows operating system.
The central processing unit (CPU) is the portion of a computer system that carries out the instructions of a computer program, and is the primary element performing the computer's functions. The CPU carries out each instruction of the program in sequence, performing the basic arithmetic, logical, and input/output operations of the system. The term has been in use in the computer industry at least since the early 1960s. The form, design, and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation remains much the same.
Early CPUs were custom-designed as part of a larger, sometimes one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are made for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones and children's toys.
The control unit of the CPU contains circuitry that uses electrical signals to direct the entire computer system to carry out stored program instructions. The control unit does not execute program instructions; rather, it directs other parts of the system to do so. The control unit must communicate with both the arithmetic/logic unit and memory.
[Figure: CPU, core memory, and external bus interface of a DEC PDP-8/I, made of medium-scale integrated circuits.]
The design complexity of CPUs increased as various technologies facilitated building smaller and more reliable electronic devices. The first such improvement came with the advent of the transistor. Transistorized CPUs during the 1950s and 1960s no longer had to be built out of bulky, unreliable, and fragile switching elements like vacuum tubes and electrical relays. With this improvement more complex and reliable CPUs were built onto one or several printed circuit boards containing discrete (individual) components.
The introduction of the microprocessor in the 1970s significantly affected the design and implementation of CPUs. Since the introduction of the first commercially available microprocessor (the Intel 4004) in 1971 and the first widely used microprocessor (the Intel 8080) in 1974, this class of CPUs has almost completely overtaken all other central processing unit implementation methods. Mainframe and minicomputer manufacturers of the time launched proprietary IC development programs to upgrade their older computer architectures, and eventually produced instruction set compatible microprocessors that were backward-compatible with their older hardware and software. Combined with the advent and eventual vast success of the now ubiquitous personal computer, the term CPU is now applied almost exclusively to microprocessors. Several CPUs can be combined in a single processing chip.
Previous generations of CPUs were implemented as discrete components and numerous small integrated circuits (ICs) on one or more circuit boards. Microprocessors, on the other hand, are CPUs manufactured on a very small number of ICs; usually just one. The overall smaller CPU size as a result of being implemented on a single die means faster switching time because of physical factors like decreased gate parasitic capacitance. This has allowed synchronous microprocessors to have clock rates ranging from tens of megahertz to several gigahertz. Additionally, as the ability to construct exceedingly small transistors on an IC has increased, the complexity and number of transistors in a single CPU has increased dramatically. This widely observed trend is described by Moore's law, which has proven to be a fairly accurate predictor of the growth of CPU (and other IC) complexity to date.
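Moore's law as described above is often stated as transistor counts doubling roughly every two years. A rough sketch of that growth, using the commonly cited Intel 4004 baseline of about 2,300 transistors (the doubling period is the usual approximation, not an exact law):

```python
# Rough illustration of Moore's law: transistor count doubling about
# every two years. The 2,300-transistor baseline (Intel 4004) is a
# commonly quoted figure; the exact doubling period varies by era.

def transistors(years_elapsed, baseline=2300, doubling_period=2.0):
    """Projected transistor count after a given number of years."""
    return baseline * 2 ** (years_elapsed / doubling_period)

# Twenty years of doubling every two years is ten doublings:
# 2,300 * 2**10 = 2,355,200 transistors.
print(transistors(20))
```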
The fundamental operation of most CPUs, regardless of the physical form they take, is to execute a sequence of stored instructions called a program. The program is represented by a series of numbers that are kept in some kind of computer memory. There are four steps that nearly all CPUs use in their operation: fetch, decode, execute, and write back.
The first step, fetch, involves retrieving an instruction (which is represented by a number or sequence of numbers) from program memory. The location in program memory is determined by a program counter (PC), which stores a number that identifies the current position in the program. After an instruction is fetched, the PC is incremented by the length of the instruction word in terms of memory units. Often, the instruction to be fetched must be retrieved from relatively slow memory, causing the CPU to stall while waiting for the instruction to be returned. This issue is largely addressed in modern processors by caches and pipeline architectures.
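The fetch step above can be sketched in a few lines. This is a toy model, not NEC's design: program memory is assumed to be a simple list of numbers, and every instruction is assumed to occupy one memory unit (real ISAs may use variable-length encodings).

```python
# Minimal sketch of the fetch step: read the instruction at the
# program counter (PC), then advance the PC by the instruction length.

def fetch(program_memory, pc):
    """Return the instruction at the current PC and the updated PC."""
    instruction = program_memory[pc]
    # In this toy machine every instruction is one memory unit long,
    # so the PC simply advances by 1 after each fetch.
    pc += 1
    return instruction, pc

memory = [0x10, 0x21, 0x32]   # three hypothetical instruction words
pc = 0
instr, pc = fetch(memory, pc)
```

In a real processor this read may hit relatively slow memory, which is exactly the stall that caches and pipelining are meant to hide.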
The instruction that the CPU fetches from memory is used to determine what the CPU is to do. In the decode step, the instruction is broken up into parts that have significance to other portions of the CPU. The way in which the numerical instruction value is interpreted is defined by the CPU's instruction set architecture (ISA). Often, one group of numbers in the instruction, called the opcode, indicates which operation to perform. The remaining parts of the number usually provide information required for that instruction, such as operands for an addition operation. Such operands may be given as a constant value (called an immediate value), or as a place to locate a value: a register or a memory address, as determined by some addressing mode. In older designs the portions of the CPU responsible for instruction decoding were unchangeable hardware devices. However, in more abstract and complicated CPUs and ISAs, a microprogram is often used to assist in translating instructions into various configuration signals for the CPU. This microprogram is sometimes rewritable, so that it can be modified to change the way the CPU decodes instructions even after it has been manufactured.
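The four steps can be put together in a toy interpreter. The 8-bit instruction format here (high nibble = opcode, low nibble = an immediate operand) and the opcode meanings are invented purely for illustration; they do not correspond to any real ISA.

```python
# Toy fetch-decode-execute-writeback loop for an invented 8-bit ISA
# with a single accumulator register.

OPCODES = {0x1: "LOAD_IMM", 0x2: "ADD_IMM", 0x3: "HALT"}

def run(program):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while True:
        # 1. Fetch: read the instruction word and advance the PC.
        instr = program[pc]
        pc += 1
        # 2. Decode: split the word into opcode and operand fields.
        opcode = OPCODES[(instr >> 4) & 0xF]
        operand = instr & 0xF        # an immediate value in this ISA
        # 3. Execute: perform the operation named by the opcode.
        if opcode == "LOAD_IMM":
            result = operand
        elif opcode == "ADD_IMM":
            result = acc + operand
        elif opcode == "HALT":
            return acc
        # 4. Write back: store the result in the accumulator.
        acc = result

# Program: LOAD_IMM 5; ADD_IMM 3; HALT
print(run([0x15, 0x23, 0x30]))
```

Running the three-instruction program loads 5, adds 3, and halts, leaving 8 in the accumulator. A microprogrammed CPU would implement step 2 (and parts of step 3) as a lookup into rewritable control storage rather than fixed decode logic.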