Oscilloscope

An oscilloscope is a laboratory instrument commonly used to display and analyze the waveform of electronic signals. In effect, the device draws a graph of the instantaneous signal voltage as a function of time.

A typical oscilloscope can display alternating current (AC) or pulsating direct current (DC) waveforms having a frequency as low as approximately 1 hertz (Hz) or as high as several megahertz (MHz). High-end oscilloscopes can display signals having frequencies up to several hundred gigahertz (GHz). The display is broken up into so-called horizontal divisions (hor div) and vertical divisions (vert div). Time is displayed from left to right on the horizontal scale. Instantaneous voltage appears on the vertical scale, with positive values going upward and negative values going downward.
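
Reading a value off the screen amounts to counting divisions and multiplying by the scale settings. The short Python sketch below works through one hypothetical measurement; the scale factors and division counts are invented for illustration and are not taken from any particular instrument.

    # Hypothetical oscilloscope reading: count graticule divisions and
    # convert them using the horizontal and vertical scale settings.
    time_per_div = 0.5e-3    # horizontal scale: 0.5 ms per division (assumed)
    volts_per_div = 2.0      # vertical scale: 2 V per division (assumed)

    divs_per_cycle = 4.0     # one full cycle spans 4 horizontal divisions
    peak_divs = 3.0          # the positive peak sits 3 divisions above center

    period = divs_per_cycle * time_per_div     # 4 div x 0.5 ms/div = 2 ms
    frequency = 1.0 / period                   # 1 / 2 ms = 500 Hz
    peak_voltage = peak_divs * volts_per_div   # 3 div x 2 V/div = 6 V

    print(f"Period: {period * 1e3:.1f} ms")
    print(f"Frequency: {frequency:.0f} Hz")
    print(f"Peak voltage: {peak_voltage:.1f} V")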

Tennis for Two

Before the era of electronic ping pong, hungry yellow dots, plumbers, mushrooms, and fire-flowers, people waited in line to play video games at roller-skating rinks, arcades, and other hangouts. More than fifty years ago, before either arcades or home video games, visitors waited in line at Brookhaven National Laboratory to play “Tennis for Two,” an electronic tennis game that is unquestionably a forerunner of the modern video game. Tennis for Two was first introduced on October 18, 1958, at one of the Lab’s annual visitors’ days. Two people played the electronic tennis game with separate controllers that connected to an analog computer and used an oscilloscope for a screen. The game’s creator, William Higinbotham, was a nuclear physicist who had worked on the Manhattan Project and lobbied for nuclear nonproliferation as the first chair of the Federation of American Scientists.

http://www.bnl.gov/bnlweb/history/higinbotham.asp

http://www.thecreatorsproject.com/blog/take-a-trip-through-the-ihistory-of-gamingi

Napster and the Death of the CD

Shawn Fanning (born November 22, 1980) is an American computer programmer, serial entrepreneur, and angel investor. He is famous for developing Napster, one of the first popular peer-to-peer (“P2P”) file sharing platforms, in 1998. The popularity of Napster was widespread, and Fanning was featured on the cover of Time magazine.[2] The site in its initial free P2P incarnation was shut down in 2001 after the company’s unsuccessful appeal of court orders arising from its encouragement of the illegal sharing of copyrighted material. A paid subscription version of the site followed, and remains the current format. Following his involvement with Napster, he joined, and invested in, a number of early-stage technology startup companies.

Digital rights management (DRM) is a term for access control technologies that are used by hardware manufacturers, publishers, copyright holders and individuals to limit the use of digital content and devices. The term is used to describe any technology that inhibits uses of digital content that are not desired or intended by the content provider. The term does not generally refer to other forms of copy protection that can be circumvented without modifying the file or device, such as serial numbers or keyfiles. It can also refer to restrictions associated with specific instances of digital works or devices. Digital rights management is used by companies such as Sony, Amazon, Apple Inc., Microsoft, AOL and the BBC.

The use of digital rights management is controversial. Proponents argue it is needed by copyright holders to prevent unauthorized duplication of their work, either to maintain artistic integrity[1] or to ensure continued revenue streams.[2] Some opponents, such as the Free Software Foundation (through its Defective By Design campaign), maintain that the use of the word “rights” is misleading and suggest that people instead use the term digital restrictions management. Their position is essentially that copyright holders are restricting the use of material in ways that are beyond the scope of existing copyright laws, and should not be covered by future laws.[3] The Electronic Frontier Foundation, and other opponents, also consider the use of DRM systems to be anti-competitive practice.[4] This position holds that the rights of the user need legal protection.[5]

Many online music stores employ DRM to restrict usage of music purchased and downloaded online. There are many options for consumers wishing to purchase digital music over the internet:

  • Prior to 2009, Apple’s iTunes Store utilized the FairPlay DRM system for music. In May of 2007, EMI tracks became available in iTunes Plus format at a higher price point. These tracks were higher quality (256 kbps) and DRM free. In October of 2007, the cost of iTunes Plus tracks was lowered to US$0.99.[15] In April of 2009, all iTunes music became available completely DRM free. (Videos sold and rented through iTunes, as well as iOS Apps, however, were to continue using Apple’s FairPlay DRM.)
  • Napster music store offers a subscription-based approach to DRM alongside permanent purchases. Users of the subscription service can download and stream an unlimited amount of music transcoded to Windows Media Audio (WMA) while subscribed to the service. But when the subscription period lapses, all of the downloaded music is unplayable until the user renews his or her subscription. Napster also charges users who wish to use the music on their portable device an additional $5 per month. In addition, Napster gives users the option of paying an additional $0.99 per track to burn it to CD or for the song to never expire. Music bought through Napster can be played on players carrying the Microsoft PlaysForSure logo (which, notably, do not include iPods or even Microsoft’s own Zune). As of June 2009, Napster offers DRM-free MP3 music, which can be played on iPhones and iPods.
  • Wal-Mart Music Downloads, another online music download store, charges $0.94 per track for all non-sale downloads. All Wal-Mart Music Downloads can be played on any Windows PlaysForSure-marked product. The music does play on SanDisk’s Sansa MP3 player, for example, but must be copied to the player’s internal memory. It cannot be played through the player’s microSD card slot, which is a problem that many users of the MP3 player experience.
  • Sony operated an online music download service called “Connect” which used Sony’s proprietary OpenMG DRM technology. Music downloaded from this store (usually via Sony’s SonicStage software) was only playable on computers running Microsoft Windows and Sony hardware (including the PSP and some Sony Ericsson phones).
  • Kazaa is one of a few services offering a subscription-based pricing model. However, music downloads from the Kazaa website are DRM-protected, and can only be played on computers or portable devices running Windows Media Player, and only as long as the customer remains subscribed to Kazaa.

The various services are currently not interoperable, though those that use the same DRM system (for instance the several Windows Media DRM format stores, including Napster, Kazaa and Yahoo Music) all provide songs that can be played side-by-side through the same player program. Almost all stores require client software of some sort to be downloaded, and some also need plug-ins. Several colleges and universities, such as Rensselaer Polytechnic Institute, have made arrangements with assorted Internet music suppliers to provide access (typically DRM-restricted) to music files for their students, with less than universal popularity, sometimes making payments from student activity fee funds.[16] One of the problems is that the music becomes unplayable after leaving school unless the student continues to pay individually. Another is that few of these vendors are compatible with the most common portable music player, the Apple iPod. The Gowers Review of Intellectual Property (to HMG in the UK; 141 pages, 40+ specific recommendations) has taken note of the incompatibilities, and suggests (Recommendations 8–12) that there be explicit fair dealing exceptions to copyright allowing libraries to copy and format-shift between DRM schemes, and further allowing end users to do the same privately. If adopted, some of the acrimony may decrease.

Although DRM is prevalent for Internet music, some online music stores such as eMusic, Dogmazic, Amazon, and Beatport do not use DRM despite encouraging users to avoid sharing music. Another online retailer, Xiie.net, which sells only unsigned artists, encourages people to share the music they buy from the site, to increase exposure for the artists themselves. Major labels have begun releasing more online music without DRM. Eric Bangeman suggests in Ars Technica that this is because the record labels are “slowly beginning to realize that they can’t have DRMed music and complete control over the online music market at the same time… One way to break the cycle is to sell music that is playable on any digital audio player. eMusic does exactly that, and their surprisingly extensive catalog of non-DRMed music has vaulted it into the number two online music store position behind the iTunes Store.” Apple’s Steve Jobs has called on the music industry to eliminate DRM in an open letter titled Thoughts on Music. Apple’s iTunes store began selling DRM-free 256 kbit/s (up from 128 kbit/s) AAC-encoded music from EMI for a premium price (this has since reverted to the standard price). In March 2007, Musicload.de, one of Europe’s largest online music retailers, announced their position strongly against DRM. In an open letter, Musicload stated that three out of every four calls to their customer support phone service are a result of consumer frustration with DRM.

CERN and Tim Berners-Lee (Taken from the CERN Site)

In 1989, while working at CERN, the European Particle Physics Laboratory in Geneva, Switzerland, Tim Berners-Lee proposed a global hypertext project, to be known as the World Wide Web. Based on the earlier “Enquire” work, it was designed to allow people to work together by combining their knowledge in a web of hypertext documents. He wrote the first World Wide Web server, “httpd”, and the first client, “WorldWideWeb”, a what-you-see-is-what-you-get hypertext browser/editor that ran in the NeXTStep environment. This work was started in October 1990, and the program “WorldWideWeb” was first made available within CERN in December, and on the Internet at large in the summer of 1991.

Where the web was born

Tim Berners-Lee, a scientist at CERN, invented the World Wide Web (WWW) in 1989. The Web was originally conceived and developed to meet the demand for automatic information sharing between scientists working in different universities and institutes all over the world.

CERN is not an isolated laboratory, but rather a focus for an extensive community that now includes about 60 countries and about 8000 scientists. Although these scientists typically spend some time on the CERN site, they usually work at universities and national laboratories in their home countries. Good contact is clearly essential.

The basic idea of the WWW was to merge the technologies of personal computers, computer networking and hypertext into a powerful and easy to use global information system.

CERN is the European Organization for Nuclear Research. The name is derived from the acronym for the French Conseil Européen pour la Recherche Nucléaire, or European Council for Nuclear Research, a provisional body founded in 1952 with the mandate of establishing a world-class fundamental physics research organization in Europe. At that time, pure physics research concentrated on understanding the inside of the atom, hence the word ‘nuclear’.

When the Organization officially came into being in 1954, the Council was dissolved, and the new organization was given the title European Organization for Nuclear Research, although the name CERN was retained.

Today, our understanding of matter goes much deeper than the nucleus, and CERN’s main area of research is particle physics — the study of the fundamental constituents of matter and the forces acting between them. Because of this, the laboratory operated by CERN is commonly referred to as the European Laboratory for Particle Physics.

How the web began

The first proposal for the World Wide Web (WWW) was made at CERN by Tim Berners-Lee in 1989, and further refined by him and Robert Cailliau in 1990.

By the end of that year, prototype software for a basic system was already being demonstrated. To encourage its adoption, an interface to the CERN Computer Centre’s documentation, to the ‘help service’ and also to the familiar Usenet newsgroups was provided.

The first web servers were all located in European physics laboratories and only a few users had access to the NeXT platform on which the first browser ran. CERN soon provided a much simpler browser, which could be run on any system.

In 1991, an early WWW system was released to the high energy physics community via the CERN program library. It included the simple browser, web server software and a library, implementing the essential functions for developers to build their own software. A wide range of universities and research laboratories started to use it. A little later it was made generally available via the Internet, especially to the community of people working on hypertext systems.

Going global

The first web server in the United States came on-line in December 1991, once again in a pure research institute: the Stanford Linear Accelerator Center (SLAC) in California.

At this stage, there were essentially only two kinds of browser. One was the original development version, very sophisticated but only available on the NeXT machines. The other was the ‘line-mode’ browser, which was easy to install and run on any platform but limited in power and user-friendliness. It was clear that the small team at CERN could not do all the work needed to develop the system further, so Berners-Lee launched a plea via the Internet for other developers to join in.

Several individuals wrote browsers, mostly for the X Window System. The most notable from this era are MIDAS by Tony Johnson from SLAC, Viola by Pei Wei from O’Reilly, and Erwise by a Finnish team from the Helsinki University of Technology.

Early in 1993, the National Center for Supercomputing Applications (NCSA) at the University of Illinois released a first version of their Mosaic browser. This software ran in the X Window System environment, popular in the research community, and offered friendly window-based interaction. Shortly afterwards the NCSA released versions also for the PC and Macintosh environments. The existence of reliable user-friendly browsers on these popular computers had an immediate impact on the spread of the WWW. The European Commission approved its first web project (WISE) at the end of the same year, with CERN as one of the partners. By late 1993 there were over 500 known web servers, and the WWW accounted for 1% of Internet traffic, which seemed a lot in those days! (The rest was remote access, e-mail and file transfer.)

1994 really was the ‘Year of the Web’. The world’s first International World Wide Web Conference was held at CERN in May. It was attended by 400 users and developers, and was hailed as the ‘Woodstock of the Web’. As 1994 progressed, the Web stories got into all the media. A second conference, attended by 1300 people, was held in the US in October, organised by the NCSA and the newly created International WWW Conference Committee (IW3C2).

By the end of 1994, the Web had 10,000 servers, of which 2,000 were commercial, and 10 million users. Traffic was equivalent to shipping the entire collected works of Shakespeare every second. The technology was continually extended to cater for new needs. Security and tools for e-commerce were the most important features soon to be added.

Open standards

An essential point was that the Web should remain an open standard for all to use and that no-one should lock it up into a proprietary system.
In this spirit, CERN submitted a proposal to the Commission of the European Union under the ESPRIT programme: ‘WebCore’. The goal of the project was to set up an international consortium in collaboration with the US Massachusetts Institute of Technology (MIT). Berners-Lee officially left CERN at the end of 1994 to work on the Consortium from the MIT base. But with approval of the LHC project clearly in sight, it was decided that further Web development was an activity beyond the Laboratory’s primary mission. A new home for basic Web work was needed.

The European Commission turned to the French National Institute for Research in Computer Science and Control (INRIA) to take over the role of CERN.

In January 1995, the International World Wide Web Consortium (W3C) was founded ‘to lead the World Wide Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability’.

By 2007 W3C, run jointly by MIT/LCS in the US, INRIA in France, and Keio University in Japan, had more than 430 member organizations from around the world.

What is TrueType?

If you are sitting at a Windows or Macintosh computer right now, then you are looking at a TrueType font as you read this! Fonts are the different styles of typefaces used by a computer to display text. If you are like most people, you are probably looking at text in many different sizes and you may even want to print out a document. Early computer operating systems relied on bitmapped fonts for display and printing. These fonts had to be individually created for display at each particular size desired. If you made the font larger or smaller than it was intended to be, it looked horrible. And printed text was almost always very jagged looking.

In the late 1980s, Adobe introduced its Type 1 fonts based on vector graphics. Unlike bitmapped fonts, vector fonts could be made larger or smaller (scaling) and still look good. Adobe also developed a printing language called PostScript that was vastly superior to anything else on the market. Microsoft and Apple were very interested in these technologies but did not want to pay royalties to Adobe for something that could become an integral part of both companies’ operating systems. For that reason, Microsoft and Apple joined forces to develop vector font and printing technology of their own. In the end, Apple actually developed the font technology, TrueType. Meanwhile, the print engine being developed by Microsoft, TrueImage, never really got off the ground.


Brief History of Adobe

For 25 years, Adobe Systems Inc. has pushed the boundaries of publishing and printing. With their proprietary PDF format, computer scientists John Warnock and Charles Geschke have built a company that has become the software provider of choice for a wide range of industries.

Before founding Adobe Systems, Inc. in 1982, both men worked at the prominent Xerox Palo Alto Research Center (PARC) in the late 1970s. The inspiration to create Adobe came from the research they conducted on device-independent graphic systems and printers. Their goal as technological innovators was to translate digital text and images onscreen accurately onto the printed page. This idea would be the motivating force behind Adobe’s constant innovation and re-invention of technology.


Introduction to TCP/IP

Summary: TCP and IP were developed by a Department of Defense (DOD) research project to connect a number of different networks designed by different vendors into a network of networks (the “Internet”). It was initially successful because it delivered a few basic services that everyone needs (file transfer, electronic mail, remote logon) across a very large number of client and server systems. Several computers in a small department can use TCP/IP (along with other protocols) on a single LAN. The IP component provides routing from the department to the enterprise network, then to regional networks, and finally to the global Internet. On the battlefield a communications network will sustain damage, so the DOD designed TCP/IP to be robust and to recover automatically from any node or phone line failure. This design allows the construction of very large networks with less central management. However, because of the automatic recovery, network problems can go undiagnosed and uncorrected for long periods of time.

As with all other communications protocols, TCP/IP is composed of layers:

  • IP – is responsible for moving packets of data from node to node. IP forwards each packet based on a four-byte destination address (the IP number). The Internet authorities assign ranges of numbers to different organizations. The organizations assign groups of their numbers to departments. IP operates on gateway machines that move data from department to organization to region and then around the world.
  • TCP – is responsible for verifying the correct delivery of data from client to server. Data can be lost in the intermediate network. TCP adds support to detect errors or lost data and to trigger retransmission until the data is correctly and completely received.
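
The division of labour between the two layers can be sketched in a few lines of code. The example below uses Python’s standard socket library: the four-byte IP address identifies the destination machine, while the TCP stream socket handles verified, in-order delivery between client and server. The host address, port number, and message are hypothetical placeholders chosen only for illustration.

    import socket

    HOST = "127.0.0.1"   # example IP address: the four-byte number IP routes on
    PORT = 50007         # arbitrary example port

    def run_server():
        # Accept one TCP connection and echo back whatever arrives.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, addr = srv.accept()
            with conn:
                data = conn.recv(1024)
                conn.sendall(data)   # TCP retransmits until this arrives intact

    def run_client():
        # Open a TCP connection, send a message, and print the echoed reply.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect((HOST, PORT))
            cli.sendall(b"hello, network")
            print(cli.recv(1024).decode())

Running run_server() in one process and run_client() in another sends the message down through TCP and IP on one machine and back up the same layers on the other; TCP detects any loss and triggers retransmission, while IP simply forwards each packet toward the destination address.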


ARPANET — The First Internet

In 1958, ARPA (the Advanced Research Projects Agency) was created by the United States military to ensure American technological superiority, a need prompted by the 1957 launch of the Russian satellite Sputnik. ARPA’s work would become the basis for the development of the Internet. ARPAnet (later the Internet) was developed by a group of academics and military personnel from the country’s foremost technology-based universities. Once ARPAnet was up and running, the next step was to focus on the delivery of this computer-formulated information.

ARPANET deployed

 

Historical document: First ARPANET IMP log: the first message ever sent via the ARPANET, 10:30 PM, 29 October 1969. This IMP Log excerpt, kept at UCLA, describes setting up a message transmission from the UCLA SDS Sigma 7 Host computer to the SRI SDS 940 Host computer.

The initial ARPANET consisted of four IMPs, located at:

  • The University of California, Los Angeles (UCLA)
  • The Stanford Research Institute (SRI)
  • The University of California, Santa Barbara (UCSB)
  • The University of Utah

The first message on the ARPANET was sent by UCLA student programmer Charley Kline at 10:30 p.m. on October 29, 1969, from Boelter Hall 3420.[7] Supervised by Prof. Leonard Kleinrock, Kline transmitted from the university’s SDS Sigma 7 Host computer to the Stanford Research Institute’s SDS 940 Host computer. The message text was the word “login”; the “l” and the “o” letters were transmitted, but the system then crashed. Hence, the literal first message over the ARPANET was “lo”. About an hour later, having recovered from the crash, the SDS Sigma 7 computer effected a full “login”. The first permanent ARPANET link was established on November 21, 1969, between the IMP at UCLA and the IMP at the Stanford Research Institute. By December 5, 1969, the entire four-node network was established.[8]

The contents of the first email transmission in 1971 have been forgotten; in the Frequently Asked Questions section of his Web site, the sender, Ray Tomlinson, who sent the message between two computers sitting side-by-side, claims that the contents were “entirely forgettable, and I have, therefore, forgotten them”, and speculates that the message likely was “QWERTYUIOP” or some such.[9]

The ARPANET was the first wide area packet switching network, the “Eve” network of what has evolved into the Internet we know and love today.

The ARPANET was originally created by the IPTO under the sponsorship of DARPA, and conceived and planned by Lick Licklider, Lawrence Roberts, and others as described earlier in this section.

The ARPANET went into labor on August 30, 1969, when BBN delivered the first Interface Message Processor (IMP) to Leonard Kleinrock’s Network Measurements Center at UCLA. The IMP was built from a Honeywell DDP 516 computer with 12K of memory, designed to handle the ARPANET network interface. In a famous piece of Internet lore, on the side of the crate, a hardware designer at BBN named Ben Barker had written “Do it to it, Truett”, in tribute to the BBN engineer Truett Thach who traveled with the computer to UCLA on the plane.
