Friday, 28 November 2008
how search engines work
Web search engines work by storing information about many web pages, which they retrieve from the WWW itself. These pages are retrieved by a Web crawler (sometimes also known as a spider), an automated Web browser which follows every link it sees. Exclusions can be made by the use of robots.txt. The contents of each page are then analyzed to determine how it should be indexed (for example, words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries. Some search engines, such as Google, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista, store every word of every page they find. A cached page always holds the actual text that was indexed, so it can be very useful when the content of the live page has been updated and the search terms no longer appear in it. This problem might be considered a mild form of linkrot, and serving the cached copy satisfies the principle of least astonishment: users expect the search terms to appear on the returned page. Beyond improving search relevance, cached pages may also preserve data that is no longer available anywhere else.
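The indexing step described above can be sketched in a few lines. This is a minimal illustration, not how any real engine is implemented; the page URLs and helper names are invented for the example, and real engines also weight words from titles, headings, and meta tags more heavily than body text.

```python
import re
from collections import defaultdict

def extract_words(text):
    """Lowercase a page's text and split it into words."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(pages):
    """Build an inverted index: each word maps to the set of URLs containing it.

    `pages` is a dict of {url: page_text}, standing in for what a
    crawler would have fetched (after checking robots.txt).
    """
    index = defaultdict(set)
    for url, text in pages.items():
        for word in extract_words(text):
            index[word].add(url)
    return index

# Two toy "crawled" pages (hypothetical URLs).
pages = {
    "http://example.com/a": "Ubuntu Linux desktop guide",
    "http://example.com/b": "Linux virtual machine tutorial",
}
index = build_index(pages)
```

A later query simply looks words up in `index` instead of rescanning every page, which is what makes answering queries over millions of pages feasible.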
When a user enters a query into a search engine (typically by using key words), the engine examines its index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text. Most search engines support the use of the boolean operators AND, OR and NOT to further specify the search query. Some search engines provide an advanced feature called proximity search which allows users to define the distance between keywords.
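The boolean operators AND, OR and NOT map directly onto set operations over an inverted index. The sketch below is illustrative only; the parameter names and the tiny index are invented, and real engines combine this with ranking rather than returning a bare set.

```python
def search(index, must=(), any_of=(), must_not=()):
    """Evaluate a boolean query against an inverted index.

    `must` terms are ANDed (intersection), `any_of` terms are ORed
    (union), and `must_not` terms are excluded (difference).
    Returns the set of matching page IDs.
    """
    if must:
        results = set.intersection(*(index.get(t, set()) for t in must))
        if any_of:  # restrict AND results to pages matching any OR term
            results &= set.union(*(index.get(t, set()) for t in any_of))
    elif any_of:
        results = set.union(*(index.get(t, set()) for t in any_of))
    else:
        return set()  # a pure-NOT query is not supported in this sketch
    for t in must_not:
        results -= index.get(t, set())
    return results

# Toy index: word -> set of page IDs.
index = {
    "linux": {"a", "b"},
    "ubuntu": {"a"},
    "windows": {"b"},
}
```

For example, the query "linux NOT windows" becomes `search(index, must=["linux"], must_not=["windows"])`.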
The usefulness of a search engine depends on the relevance of the result set it gives back. While there may be millions of webpages that include a particular word or phrase, some pages may be more relevant, popular, or authoritative than others. Most search engines employ methods to rank the results to provide the "best" results first. How a search engine decides which pages are the best matches, and what order the results should be shown in, varies widely from one engine to another. The methods also change over time as Internet usage changes and new techniques evolve.
Most Web search engines are commercial ventures supported by advertising revenue and, as a result, some employ the practice of allowing advertisers to pay money to have their listings ranked higher in search results. Those search engines which do not accept money for their search engine results make money by running search related ads alongside the regular search engine results. The search engines make money every time someone clicks on one of these ads.
Revenue in the web search portals industry is projected to grow in 2008 by 13.4 percent, with broadband connections expected to rise by 15.1 percent. Between 2008 and 2012, industry revenue is projected to rise by 56 percent as Internet penetration still has some way to go to reach full saturation in American households. Furthermore, broadband services are projected to account for an ever increasing share of domestic Internet users, rising to 118.7 million by 2012, with an increasing share accounted for by fiber-optic and high speed cable lines.[8]
Google first launched in 1998. Google has one of the largest databases, covering blogs, wikis, and websites, and it also brings up PDF files that can be downloaded. Google does not offer a dedicated airline search, and searches are not case sensitive.
Saturday, 22 November 2008
what is kubuntu?
Kubuntu is an official derivative of the Ubuntu Linux desktop operating system that uses the KDE desktop. It is part of the Ubuntu project, and all packages share the same archives as Ubuntu.
It is even possible to install KDE on Ubuntu to get, in effect, a Kubuntu system; the other way round, GNOME can be installed on Kubuntu, so there is no real limitation on which desktop environment you use. Kubuntu simply ships with KDE preinstalled and preconfigured, and without GNOME.
From the Kubuntu page on the Ubuntu Wiki pages "The Kubuntu project aims to be to KDE what Ubuntu is to GNOME: a great integrated distro with all the great features of Ubuntu, but based on the KDE desktop. Kubuntu is released regularly and predictably; a new release is made with a release of a new KDE Version."
Kubuntu means "towards humanity" in Bemba.
Releases
The first Kubuntu release was published on April 8, 2005. It included KDE 3.4 and a selection of the most useful KDE programs not in KDE itself, including amaroK, Kaffeine and Gwenview. Both Live CDs and Install CDs for x86, PowerPC and AMD64 platforms are available. There are also daily builds of the CDs.
ubuntu 8.04
Ubuntu Desktop Edition
With Ubuntu Desktop Edition you can surf the web, read email, create documents and spreadsheets, edit images and much more. Ubuntu has a fast and easy graphical installer right on the Desktop CD. On a typical computer the installation should take you less than 25 minutes.
Desktop Tour
The fastest way to see Ubuntu is to take the tour.
Desktop simplicity
When you start your system for the first time you'll see a desktop that is clean and tidy, with no desktop icons and a default theme that is easy on the eye.
Ubuntu 'Just Works'
We've done all the hard work for you. Once Ubuntu is installed, all the basics are in place so that your system will be immediately usable.
A complete office productivity suite
OpenOffice contains a user interface and feature set that is similar to other office suites, and includes all the key desktop applications you need, such as:
Word processor - for anything from writing a quick letter to producing an entire book. More »
Spreadsheet - a tool to calculate, analyse, and present your data in numerical reports or charts. More »
Presentation - an easy, and powerful tool for creating effective multimedia presentations. More »
Edit and share files in other formats
Easily open, edit and share files with friends who use Microsoft Office, WordPerfect, KOffice or StarOffice.
Quick and easy updates
The task bar contains an update area where we'll notify you when there are updates available for your system, from simple security fixes to a complete version upgrade. The update facility enables you to keep your system up-to-date with just a few clicks of your mouse.
A vast library of free software
Need more software? Simply choose from thousands of software packages in the Ubuntu catalogue, all available to download and install at the click of a button. And it's all completely free!
Help and support
You'll be able to find help using the desktop browser or online. If you have a question about using Ubuntu, you can bet someone else has already asked it. Our community has developed a range of documentation that may contain the answer to your question, or give you ideas about where to look.
This is also where you'll get access to free support from the Ubuntu community in the chat and mailing lists in many languages. Alternatively, you can purchase professional support from the Canonical Global Support Services Team, or local providers.
Ubuntu in your local language
Ubuntu aims to be usable by as many people as possible, which is why we include the very best localisation and accessibility infrastructure that the free software community has to offer.
More Features »
You can download Ubuntu, or request a free CD from Canonical.
System Requirements
Ubuntu is available for PC, 64-bit PC and Intel-based Mac architectures. At least 256 MB of RAM is required to run the alternate install CD (384 MB of RAM is required to use the live CD based installer). Installation requires at least 4 GB of disk space.
Monday, 17 November 2008
devices
In 2007, L-com Connectivity Products, a leading manufacturer of cable assemblies, connectors, and other connectivity devices for over 25 years, acquired HyperLink Technologies of Boca Raton, FL. HyperLink Technologies is a high-quality manufacturer of antennas, amplifiers, and other wireless connectivity equipment. This acquisition strengthens L-com’s product offering, thereby creating a “one-stop source” for users of all connectivity equipment, wired or wireless.
This modem router uses Wireless G technology and a high-speed ADSL Modem to provide an all-in-one solution for connecting to the internet, e-mail and VoIP. Connect your PCs via the built-in Router and 4-port Switch to share the Internet throughout your household while advanced firewall and security features protect your PCs and your data. £39.99
Thursday, 13 November 2008
Computer Keyboard
$69.99
Advertisement ID : 749364
Ads Classification : For Sale
Location : Makati City, Metro Manila
Regular Price : P 300.00
Now Only : P 200.00
Save : P 100.00
Condition : 2nd Hand (Used)
Warranty : Personal Warranty
Computer Keyboard Reviews and Buying Guide:
The most basic component of a computer is the keyboard. Anyone who uses a computer eventually gets a "feel" for their keyboard and knows just how far from their hands certain keys are. Good typists can type over 100 words a minute, and 10-key experts can enter numbers and digits faster than you might think. Beyond the letters, numbers, and symbols you find on a computer keyboard, there are countless other functions and shortcuts. Keys like Alt-Tab, Esc, Shift, Ctrl, and the up, down and sideways arrows all help us navigate documents, web pages and software programs on a day-to-day basis. Many keyboards are slightly different, leaving an end user to learn a new routine if they switch jobs or computers.
Tuesday, 11 November 2008
A virtual machine was originally defined by Popek and Goldberg as "an efficient, isolated duplicate of a real machine". Current use includes virtual machines which have no direct correspondence to any real hardware.[1]
Example: A program written in Java receives services from the Java Runtime Environment software by issuing commands from which the expected result is returned by the Java software. By providing these services to the program, the Java software is acting as a "virtual machine", taking the place of the operating system or hardware for which the program would ordinarily have had to have been specifically written.
Virtual machines are separated into two major categories, based on their use and degree of correspondence to any real machine. A system virtual machine provides a complete system platform which supports the execution of a complete operating system (OS). In contrast, a process virtual machine is designed to run a single program, which means that it supports a single process. An essential characteristic of a virtual machine is that the software running inside is limited to the resources and abstractions provided by the virtual machine -- it cannot break out of its virtual world.
System virtual machines
See also: Virtualization and Comparison of virtual machines
System virtual machines (sometimes called hardware virtual machines) allow the sharing of the underlying physical machine resources between different virtual machines, each running its own operating system. The software layer providing the virtualization is called a virtual machine monitor or hypervisor. A hypervisor can run on bare hardware (Type 1 or native VM) or on top of an operating system (Type 2 or hosted VM).
The main advantages of system VMs are:
* multiple OS environments can co-exist on the same computer, in strong isolation from each other
* the virtual machine can provide an instruction set architecture (ISA) that is somewhat different from that of the real machine
Multiple VMs each running their own operating system (called guest operating system) are frequently used in server consolidation, where different services that used to run on individual machines in order to avoid interference are instead run in separate VMs on the same physical machine. This use is frequently called quality-of-service isolation (QoS isolation).
The desire to run multiple operating systems was the original motivation for virtual machines, as it allowed time-sharing a single computer between several single-tasking OSes.
The guest OSes do not have to be all the same, making it possible to run different OSes on the same computer (e.g., Microsoft Windows and Linux, or older versions of an OS in order to support software that has not yet been ported to the latest version). The use of virtual machines to support different guest OSes is becoming popular in embedded systems; a typical use is to support a real-time operating system at the same time as a high-level OS such as Linux or Windows.
Another use is to sandbox an OS that is not trusted, possibly because it is a system under development. Virtual machines have other advantages for OS development, including better debugging access and faster reboots.[2]
Alternative techniques such as Solaris Zones provide a level of isolation within a single operating system. The isolation is not as complete as with a VM: a kernel exploit in one zone affects all zones, whereas a kernel exploit in a VM does not affect other VMs on the same host. Zones are not virtual machines, but an example of "operating-system virtualization". This category includes other "virtual environments" (also called "virtual servers") such as Virtuozzo, FreeBSD Jails, Linux-VServer, chroot jail, and OpenVZ, which provide some form of encapsulation of processes within an operating system. These technologies have the advantage of being more resource-efficient than full virtualization; the disadvantage is that they can only run a single operating system and a single version/patch level of that operating system, so, for example, they cannot be used to run, on the same hardware, two applications where one supports only a newer OS version and the other only an older one.
Process virtual machines
See also: Virtualization and Comparison of application virtual machines
A process VM, sometimes called an application virtual machine, runs as a normal application inside an OS and supports a single process. It is created when that process is started and destroyed when it exits. Its purpose is to provide a platform-independent programming environment that abstracts away details of the underlying hardware or operating system, and allows a program to execute in the same way on any platform.
A process VM provides a high-level abstraction — that of a high-level programming language (compared to the low-level ISA abstraction of the system VM). Process VMs are implemented using an interpreter; performance comparable to compiled programming languages is achieved by the use of just-in-time compilation.
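The interpreter at the heart of a process VM can be illustrated with a toy stack machine. The "bytecode" below is invented for the example (it is not real JVM or CLR bytecode), but the principle is the same: the program is expressed against the VM's abstraction, so the identical instruction list runs unchanged on any host that has the interpreter.

```python
def run(program):
    """Interpret a list of (opcode, arg) instructions on an operand stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)            # push a constant
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)          # pop two, push their sum
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)          # pop two, push their product
        else:
            raise ValueError("unknown opcode: %s" % op)
    return stack.pop()

# (2 + 3) * 4 expressed as platform-independent "bytecode".
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None)]
```

A just-in-time compiler would translate hot instruction sequences like this into native machine code instead of dispatching them one opcode at a time, which is how process VMs approach compiled-language performance.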
This type of VM has become popular with the Java programming language, which is implemented using the Java virtual machine. Another example is the .NET Framework, which runs on a VM called the Common Language Runtime.
A special case of process VMs are systems that abstract over the communication mechanisms of a (potentially heterogeneous) computer cluster. Such a VM does not consist of a single process, but one process per physical machine in the cluster. They are designed to ease the task of programming parallel applications by letting the programmer focus on algorithms rather than the communication mechanisms provided by the interconnect and the OS. They do not hide the fact that communication takes place, and as such do not attempt to present the cluster as a single parallel machine.
Unlike other process VMs, these systems do not provide a specific programming language, but are embedded in an existing language; typically such a system provides bindings for several languages (e.g., C and FORTRAN). Examples are PVM (Parallel Virtual Machine) and MPI (Message Passing Interface). They are not strictly virtual machines, as the applications running on top still have access to all OS services, and are therefore not confined to the system model provided by the "VM".
Techniques
Emulation of the underlying raw hardware (native execution)
(Screenshot: VMware Workstation running Ubuntu, on Windows Vista.)
This approach is described as full virtualization of the hardware, and can be implemented using a Type 1 or Type 2 hypervisor. (A Type 1 hypervisor runs directly on the hardware; a Type 2 hypervisor runs on another operating system, such as Linux). Each virtual machine can run any operating system supported by the underlying hardware. Users can thus run two or more different "guest" operating systems simultaneously, in separate "private" virtual computers.
The pioneer system using this concept was IBM's CP-40, the first (1967) version of IBM's CP/CMS (1967-1972) and the precursor to IBM's VM family (1972-present). In the VM architecture, most users run CMS, a relatively simple single-user interactive operating system, as a "guest" on top of the VM control program (VM-CP). This approach kept the CMS design simple, as if it were running alone; the control program quietly provides multitasking and resource management services "behind the scenes". In addition to CMS, VM users can run any of the other IBM operating systems, such as MVS or z/OS. z/VM is the current version of VM, and is used to support hundreds or thousands of virtual machines on a given mainframe. Some installations use Linux for zSeries to run Web servers, where Linux runs as the operating system within many virtual machines.
Full virtualization is particularly helpful in operating system development, when experimental new code can be run at the same time as older, more stable, versions, each in a separate virtual machine. The process can even be recursive: IBM debugged new versions of its virtual machine operating system, VM, in a virtual machine running under an older version of VM, and even used this technique to simulate new hardware.[3]
The standard x86 processor architecture as used in modern PCs does not actually meet the Popek and Goldberg virtualization requirements. Notably, there is no execution mode where all sensitive machine instructions always trap, which would allow per-instruction virtualization.
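The trap-and-emulate idea behind the Popek and Goldberg requirements can be shown conceptually. This is a toy simulation only: the instruction names are invented and nothing here reflects real x86 semantics. The point is that unprivileged guest instructions run directly, while sensitive ones trap to the monitor, which applies their effect to the guest's virtual state instead of the real machine.

```python
SENSITIVE = {"HLT", "OUT"}  # hypothetical sensitive instructions

def monitor_emulate(insn, vstate):
    """The hypervisor emulates a trapped instruction against virtual state."""
    vstate["trapped"].append(insn)

def run_guest(instructions, vstate):
    """Execute guest instructions; sensitive ones trap to the monitor."""
    for insn in instructions:
        if insn in SENSITIVE:
            monitor_emulate(insn, vstate)   # trap: monitor takes over
        else:
            vstate["direct"] += 1           # safe: would run on the bare CPU

vstate = {"direct": 0, "trapped": []}
run_guest(["ADD", "MOV", "HLT", "ADD", "OUT"], vstate)
```

The x86 problem, in these terms, is that some sensitive instructions silently execute instead of trapping, so the monitor never gets the chance to intervene; dynamic recompilation and hardware extensions both work around that gap.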
Despite these limitations, several software packages have managed to provide virtualization on the x86 architecture, even though dynamic recompilation of privileged code, as first implemented by VMware, incurs some performance overhead compared to a VM running on a natively virtualizable architecture such as the IBM System/370 or Motorola MC68020. Since then, several other software packages such as Virtual PC, VirtualBox, Parallels Workstation and Virtual Iron have managed to implement virtualization on x86 hardware.
Intel and AMD have introduced features to their x86 processors to enable virtualization in hardware.
Emulation of a non-native system
Virtual machines can also perform the role of an emulator, allowing software applications and operating systems written for another computer processor architecture to be run.
Some virtual machines emulate hardware that only exists as a detailed specification. For example:
* One of the first was the p-code machine specification, which allowed programmers to write Pascal programs that would run on any computer running virtual machine software that correctly implemented the specification.
* The specification of the Java virtual machine.
* The Common Language Infrastructure virtual machine at the heart of the Microsoft .NET initiative.
* Open Firmware allows plug-in hardware to include boot-time diagnostics, configuration code, and device drivers that will run on any kind of CPU.
This technique allows diverse computers to run any software written to that specification; only the virtual machine software itself must be written separately for each type of computer on which it runs.
Operating system-level virtualization
Operating System-level Virtualization is a server virtualization technology which virtualizes servers on an operating system (kernel) layer. It can be thought of as partitioning: a single physical server is sliced into multiple small partitions (otherwise called virtual environments (VE), virtual private servers (VPS), guests, zones, etc.); each such partition looks and feels like a real server, from the point of view of its users.
For example, Solaris Zones supports multiple guest OSes running under the same OS (such as Solaris 10). All guest OSes have to use the same kernel level and cannot run as different OS versions. Solaris Zones also requires that the host OS be a version of Solaris; other OSes from other manufacturers are not supported.
Another example is AIX, which provides the same technique under the name of Micro Partitioning.
The operating system-level architecture has low overhead, which helps maximize efficient use of server resources. The virtualization introduces only negligible overhead and allows running hundreds of virtual private servers on a single physical server. In contrast, approaches such as full virtualization (like VMware) and paravirtualization (like Xen or UML) cannot achieve such density, because of the overhead of running multiple kernels. On the other hand, operating system-level virtualization does not allow running different operating systems (i.e. different kernels), although different libraries, distributions, etc. are possible.
Friday, 07 November 2008
Scrambling for a Garden of Paradise
Madinah - "The area between my house and my pulpit is a Roudoh (garden) from the gardens of Paradise." That excerpt from a hadith of the Prophet is what drives Muslims to scramble to enter the Roudoh. They willingly endure the crush because the Roudoh is believed to be a place where prayers are answered and a comfortable place for dhikr.
As observed by detikcom reporter Muhammad Nur Hayid in Madinah on Friday (7/11/2008), no fewer than 100,000 pilgrims perform congregational prayers at the Masjid Nabawi every day. These guests of Allah never stop striving to enter the Roudoh after completing the congregational prayer.
Because so many pilgrims jostle for a spot, the mosque's security has added guard personnel. The brown-uniformed guards no longer stand only at the mosque doors, as they usually do, but are also posted at the entrances to the Roudoh and in the area of the Roudoh directly bordering the tomb of the Prophet Muhammad (SAW).
In addition, the Saudi Arabian government has reinforced the khodimul haramain (attendants of the Masjid Nabawi) to watch for pilgrims behaving improperly in the Roudoh and at the Prophet's tomb.
These attendants, wearing turbans and white robes over their brown uniforms, do not hesitate to admonish pilgrims who linger too long inside the Roudoh. They also rebuke pilgrims deemed to be associating partners with Allah by venerating the Prophet's tomb.
"Ya Haj, this is only the tomb of the Prophet (SAW). If you want to pray, pray to Allah alone. The sharia only instructs us to offer greetings to the Prophet and to Abu Bakar and Umar Al Faruq. That is all. No Al Fatihah, no other prayers," one attendant said in a raised voice upon seeing pilgrims from India and Turkey rubbing their prayer beads on the barrier around the Prophet's tomb and praying while facing it.
For reference, the Roudoh lies in the southern left part of the mosque, roughly 5 meters behind the mihrab of the imam for the obligatory prayers at the Masjid Nabawi. The easiest way to recognize the Roudoh is the color of its carpet: greenish white, quite unlike the dark red carpet of the rest of the Masjid Nabawi.
Facing south, the Roudoh is to the right of the Prophet's tomb. Inside it are the Prophet's mihrab (the Prophet always led prayers at this mihrab, and it is still used for Friday prayers), the main minbar (still used by the khatib for the Friday sermon), and the stairs to the muezzin's platform.
quoted from detik.com
computer gaming 2
USD$2,020.74
CAD$2,384.47
[AMD SpartanXHD CrossFireX Overclocked DX10 Gaming Computer]
AMD SpartanXHD CrossFireX Overclocked DX10 Gaming Computer
The AMD Spider platform at its finest, this is the epitome of AMD gaming PC platforms. With a pair of powerful AMD/ATI video cards strapped together in full-speed 16x + 16x CrossFire on the latest AMD 790FX-based ASUS M3A32-MV ...
# AMD Phenom X4 9850 Black Edition Quad-core 2.5GHz x 4 (customizable)
# AMD/ATI Radeon HD4850 512MB (customizable)
# CrossFireX Technology
# 4GB of low-latency Dual-Channel DDR2 (customizable)
# Over half a terabyte of Storage (customizable)
# High-definition 7.1 on-board audio subsystem (customizable)
# ASUS Advanced Overclocking Features
# Dual Cold Cathode Light (customizable)
# Aftermarket Heatsink (customizable)
# Blu-Ray/HD DVD player (option)
computer gaming
USD$1,522.09
CAD$1,796.07
[AMD WhisperHD SLI Performance Desktop / DX10 Gaming Computer]
AMD WhisperHD SLI Performance Desktop / DX10 Gaming Computer
Based around an AMD Athlon 64 X2/X4 dual- or quad-core processor, this fully customizable computer system supports SLI technology for double the graphics power. You can choose to have this system arrive at your home with ...
# AMD Athlon 64 X2 6000+ (3.0 GHz x 2) (customizable)
# AMD Phenom 64 (option)
# NVIDIA GeForce 9600 GT 512MB (customizable)
# 4 GB of Dual Channel DDR2 800MHz
# Gigabit LAN
# High-definition 7.1 on-board audio subsystem (customizable)
# ASUS Advanced Overclocking Features
# Dual Cold Cathode Light (customizable)
# Aftermarket Heatsink (customizable)
# Blu-Ray/HD DVD player (option)
laptop
USD$2,507.00/CAD$2,958.26
Custom SLI DTR Laptop
[Custom SLI DTR Laptop]
This custom laptop is a wonder of modern technology. It is a niche product designed for people who demand desktop performance but require portability. The laptop comes with a Core Duo (up to Core 2 Extreme X6800) desktop processor, up to 4 GB of DDR2-6400 (800 MHz) RAM and a QUADRO FX 1600M w/512MB (upgradable up to dual GeForce 8800 GTX in SLI). The real draw of this unit is the SLI video cards: you can have the system arrive at your home with two video cards already installed in SLI mode. The unit also comes with WiFi G, Bluetooth, a built-in camera, and a slot for a TV tuner. Check out our custom options for this laptop - everything from the processor to the operating system is customizable.
sound
Product Sales
In addition to our hire services, we offer a range of products for sale. We are the authorised distributor of Bose Professional Sound Products for the Waikato and are able to assist from basic design concepts through to supply and installation.
We also supply the complete range of consumables including Gaffer Tape, Lamps, Smoke Fluid, Haze Fluid, Batteries, Neutrik Connectors, Cable and Leads.
Musicare is a retailer of Australian made Screen Technics projection screens, AV lifting devices and Flat Panel brackets.
Audio Products
Lighting Products
Complete Systems
Great audio is heard and not seen.
The world is filled with sound. In shops the right music can increase sales; in the boardroom the correct sound system delivers your message effectively every time. A professionally designed and installed sound system can make a difference to your business. With the increased use of multimedia presentations, people often overlook the audio aspects of their installation.
As Waikato distributors of BOSE Professional Sound Products we have access to the best audio systems in the world. For more than three decades BOSE have been at the forefront of audio technology, with innovative designs and superior product quality.
As each system is tailored to suit your needs, please call or email for more information about our range of audio products.
As lighting professionals we have the experience to supply you with the correct product for your needs.
Our customers range from ‘working’ DJ’s and entertainers through to promotions companies and exhibition organisers. We provide the right professional equipment to customers, products that allow them to work properly and deliver their service correctly.
We work with private companies, government organisations, local authorities, schools, colleges, nightclubs, pubs, restaurants, churches, function rooms in fact anyone who needs effective lighting in their venue or building (or outside if required).
If you want to come in and discuss your lighting needs please give us a call and arrange for a time so we can dedicate the time to giving you the level of service you require.
It is very rare that customers come to us for lighting or audio products separately. Our ability to provide a complete service under one roof is why Musicare has grown to be the Waikato's leading sound and lighting provider to all types of businesses and organisations.
With years of experience and many satisfied customers we are able to help you with your project.
To us you are not ‘just another restaurant’ or ‘a school hall project’, we work with you to establish your needs, examine your location and recommend the correct products to suit your needs not only right now, but with an eye on future needs as well.
To discuss your needs please feel free to call or email our office.
buy monitor
Introduction
Monitor Buying Guide graphic
Everyone needs a good monitor to get the most out of a PC. But which monitor you need depends on several factors--what applications you use, how much room you have on your desk, how much space you need on a virtual desktop, and of course how much you want to spend. From standard-issue 19-inchers to 24-inch monsters, here's how to sort out what you need.
The Big Picture
How is an LCD different from yesterday's CRT? We'll explain the advantages of LCDs, and tell you which monitors work best for what you do. more
The Specs Explained
We'll help you sort through the litany of options for LCD monitors. more
Monitor Shopping Tips
Ready to buy? Here's what to look for and what to avoid. more
The Big Picture
If you've replaced an old PC in the last few years, you may have kept your old monitor to use with the new machine. That's okay if it's in good shape--most monitors have a life span of about five years--but if it's a worn-out 15-inch CRT that produces barely legible text at 800 by 600 pixels, you're hobbling your productivity.
Most monitor manufacturers offer entry-level LCD models that combine very low prices with pared-down features. These monitors work well enough for Web surfing, e-mail, and other office tasks--as long as they provide adequate resolution and screen adjustment controls for brightness, color, and other settings. Midrange and professional lines often provide better image quality and extensive features, such as superior image-adjusting controls, USB ports (make sure you get a monitor with USB 2.0 ports--some models with USB 1.1 hubs are still on the market), a larger set of ergonomic options (such as height adjustment), and higher resolutions. Some professional-level monitors include asset control--to help IS managers keep track of their company's property via a LAN--and hardware calibration, which adjusts the monitor and/or graphics card to ensure precise hues. (Third-party calibration packages are also available.)
CRT vs. LCD
Historically, graphics professionals have preferred CRT monitors because they support a greater range of resolutions (including very high resolutions) and show truer colors and greater nuance in color. However, manufacturers ceased making the aperture-grille models--generally agreed to be the top-performing type of CRT for photos and general graphics work--in 2005. Many pros now use high-end LCDs, which approach the color quality of CRTs yet consume half as much power or less. The development of color-calibrating hardware and software specifically designed for LCDs has helped persuade many professionals to make the switch to flat panels. Promises of improvements in black level (perfect black--which is traditionally somewhat soft or grayed in LCDs) and a wider color gamut should make this transition still easier in the near future. Another bonus: The greater brightness of LCDs also frees graphics pros from the confines of their darkened studios.
People who work mostly with text have always gravitated toward LCDs because pixels on an LCD have well-defined edges, resulting in sharply focused letters. Some gamers still prefer CRTs because LCDs redraw their screens more slowly, which can produce blurring and motion artifacts in moving images. However, response time--the spec that governs image motion in LCDs--continues to drop, minimizing the ill effects. Modern LCDs can refresh quickly enough to make them game-worthy for most users.
Budget constraints may still drive some buyers to CRTs, which usually cost less than LCDs that have a comparably sized viewable screen area. As LCD prices fall, though, more users and companies are going for the slim form and low power usage of the LCD. So from this point on, this guide will discuss LCDs only.
Key Features
Native resolution: Because an LCD uses a fixed matrix of pixels to display its image, it has a fixed (or native) resolution at which the display looks best. A 15-inch LCD has a native resolution of 1024 by 768, while most 17-, 18-, or 19-inch models use a 1280 by 1024 native resolution. Wide-screen 23- and 24-inch models usually have a native resolution of 1920 by 1200, and 30-inch wide-screens have a resolution of 2560 by 1600. If you are using Windows XP or earlier and set the monitor to a lower-than-native resolution--to upsize very small text, for instance--the image will almost certainly be less defined, because the display will use only a portion of the pixels it contains and will scale up the resulting image to fill the screen. Keep in mind that you can never exceed the native resolution of an LCD monitor. So, for example, you will not be able to display 1600-by-1200 resolution on an LCD with a native resolution of 1280 by 1024.
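Since each panel size maps to one native resolution, it is easy to compare how much desktop space you actually gain as sizes go up. A quick Python sketch, using only the sizes and modes listed above:

```python
# Native resolutions by panel size, as listed in this guide.
NATIVE = {
    15: (1024, 768),
    17: (1280, 1024),
    19: (1280, 1024),
    24: (1920, 1200),
    30: (2560, 1600),
}

def megapixels(width, height):
    """Total pixels the panel can address, in millions."""
    return width * height / 1_000_000

for size, (w, h) in sorted(NATIVE.items()):
    print(f"{size}-inch LCD: {w} by {h} = {megapixels(w, h):.2f} megapixels")
```

Note that the 17- and 19-inch panels drive the same 1.31 megapixels; the larger screen just draws them bigger.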
Though scaling technologies have improved in recent years, you're still likely to be disappointed with their results. On Windows XP and earlier, text shows pronounced jaggedness at nonnative resolutions. Windows Vista may reduce this effect with its vector graphics, but it's still advisable to use a monitor at its native resolution. A particular LCD is a good choice only if its native resolution is one you are comfortable using for all your applications. In the PC World Test Center, all monitors are tested at native resolution.
Aspect ratio: Most LCDs have a screen aspect ratio of 4:3, much like a regular-format TV. Wide-screen monitors, however, have an aspect ratio closer to the 16:9 of HDTVs. The wide-screen format is useful for working in large spreadsheets or in programs that contain many toolbars or palettes. It's appealing for watching DVDs as well, although the image quality may not be as good as on a TV. Many users see a wide-screen monitor as an upgrade from a pair of smaller monitors, though a dual-monitor setup is usually the less expensive proposition.
One important thing to keep in mind: Screen size is measured diagonally, and the area of a wide-screen monitor's display is smaller than that of a regular-format display with the same diagonal. In other words, a 21-inch wide-screen monitor shows about as many pixels as you'd expect from a regular-format 19-inch monitor. In the days of CRT monitors, vendors would state a tube size, say 21 inches, but the diagonal of the viewable screen would be one to two inches less. With LCDs, the stated diagonal is always the true measurement of the viewable screen.
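The rule of thumb above falls out of simple geometry. Here is a sketch, assuming a 5:4 shape for the 1280-by-1024 regular-format panel and 16:10 for the wide-screen (common shapes in practice, though individual models vary):

```python
import math

def screen_dimensions(diagonal, aspect_w, aspect_h):
    """Width, height, and area (in inches) for a given diagonal and
    aspect ratio, via the Pythagorean theorem."""
    hyp = math.hypot(aspect_w, aspect_h)
    width = diagonal * aspect_w / hyp
    height = diagonal * aspect_h / hyp
    return width, height, width * height

for label, diag, aw, ah in [("19-inch 5:4 regular-format", 19, 5, 4),
                            ("21-inch 16:10 wide-screen", 21, 16, 10)]:
    w, h, area = screen_dimensions(diag, aw, ah)
    print(f"{label}: {w:.1f} by {h:.1f} in., about {area:.0f} sq. in.")
```

Note the wide-screen's shorter height (about 11.1 inches versus 11.9): that vertical loss is a big part of why a wide-screen feels comparable to a regular-format panel a couple of inches smaller in diagonal.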
Viewing angle: Measured in degrees, an LCD's viewing angle indicates how far you can move to the side (or above, or below) from the center of the display before the image quality deteriorates to unacceptable levels. No matter what size monitor you use, a wide viewing angle becomes increasingly important the more you care about getting accurate, consistent colors for design work or for tweaking digital photos. Each vendor determines its own criteria for this, as no industry-standard method has been established for measuring viewing angle. As a result, the numbers may not be comparable from one vendor to another, but they can indicate relative performance among models from the same company.
The best way to judge viewing angle is to see the monitor for yourself, but you can eliminate some models from consideration if even their vendor-reported viewing angles are below a certain value. The larger the monitor, the more important a wide viewing angle is. On monitors measuring 17 inches or more, the edge of the screen is at a greater angle to someone sitting directly in front of its center, and people are more likely to be able to share the monitor when working or giving a group presentation. A viewing angle of at least 150 degrees is advisable for these monitors.
The choice of panel technology affects the viewable angle. Some LCDs use twisted nematic panels, which have small viewing angles. On a TN screen, brightness drops and colors change as you move to the side or up and down. This can mean that your work will look different if you adjust your chair or your posture. It also makes it difficult to share your work with someone who sits next to you (although a physical swivel adjustment can help with this). A few years ago, TN panels appeared to be on their way out, but recent interest in TN's fast pixel-response times has brought more new models to the market. Other panel types, such as in-plane switching, multidomain vertical alignment, and patterned vertical alignment, have wider angles of view than TN panels can offer.
Contrast ratio: This term refers to the difference in light intensity between the brightest white and the darkest black that an LCD can produce. Look for a contrast ratio of 400:1 or better--with anything lower, colors may wash out when you turn up the brightness and may disappear when you turn it down. However, higher is only better up to a point. Contrast ratios over 600:1 are unlikely to provide any advantage, and monitor vendors are likely using "fuzzy math" to calculate those values, anyway.
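Contrast ratio is just a quotient: peak white luminance divided by black luminance, both in cd/m2. A minimal sketch with illustrative numbers (the 250 and 0.5 figures below are hypothetical measurements, not any particular monitor's spec):

```python
def contrast_ratio(white_cd_m2, black_cd_m2):
    """Peak white luminance divided by darkest-black luminance."""
    if black_cd_m2 <= 0:
        raise ValueError("black luminance must be positive")
    return white_cd_m2 / black_cd_m2

# A panel measuring 250 cd/m2 on white and 0.5 cd/m2 on black:
print(f"{contrast_ratio(250, 0.5):.0f}:1")  # 500:1
```

The division also shows why the spec is so easy to inflate: shaving the measured black level from 0.5 to 0.25 cd/m2 doubles the ratio without changing how the screen looks in ordinary use.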
PC World Test Center evaluations of 15 LCD units in May 2003 showed that in some cases, actual contrast ratios were below the vendors' specifications--by as much as 50 percent. Most companies actually erred in the opposite direction, with a published contrast ratio that fell below our measured result--something that most people probably would not mind. But since shoppers have no way to tell if the manufacturer has overstated or understated contrast ratio, the specification is essentially useless for comparison purposes. When shopping for an LCD, your best bets are to check independent reviews such as those in PC World's Reviews and Rankings section and to trust your own eyes.
Brightness: Expressed as candelas per square meter (cd/m2) or nits, this specification measures the greatest amount of light that comes from a screen displaying pure white. Nearly all LCDs have a brightness level of 250cd/m2 or greater, which should be more than sufficient. (In comparison, CRT monitors typically average about 100 cd/m2--though you might see some high-brightness CRTs.) Vendors usually set the brightness level to maximum on new monitors to impress customers. High brightness can be eye-catching for video and graphics, but it can be uncomfortable over time, particularly for text viewing--and it may cause certain photographic nuances to wash out. After using the monitor for a while, you will likely want to turn the brightness down a bit to spare your eyes. Many monitors offer screen modes that change the brightness (and sometimes color and other characteristics) to make certain types of content look best.
Digital versus analog: If you have a graphics card with digital video-out--and if your computer is less than two years old, you probably do--choose an LCD that has DVI digital input. The image won't have to be converted from digital to analog and back again, so it will be clearer. Even if you don't have a DVI port on your system, choosing a digital LCD makes sense, because your next desktop PC probably will have a DVI port--and most digital-capable monitors also have a VGA (analog) connection. Digital inputs tend to be found on more-expensive LCDs. Very few notebook PCs come with digital outputs for external monitors, though some notebooks gain a DVI connection when they attach to a docking station or port replicator. (Note that there are two types of DVI connections found on typical LCD monitors: DVI-D and DVI-I. DVI-D is a digital-only port; DVI-I can accept either an analog or a digital input, though you'll need a special adapter to hook it up to your PC's VGA analog port.) DVI-I obviously provides greater flexibility.
Special inputs: As users do more video and photo editing at their PCs--and as more watch DVDs on them--more monitors offer inputs we used to see only on TVs or peripherals. Photographers and videographers may be interested in S-Video ports and memory card slots; DVD aficionados may want to keep their eyes peeled for monitors with component and/or composite inputs.
Response time: Pixel response time governs the time (measured in milliseconds) required for a pixel to change. In theory, a low response time signifies an LCD with minimal motion artifacts in moving images. This spec is especially important to video watchers and gamers.
There are two main types of LCD response time. Rise-and-fall response time measures the time it takes a pixel to turn from black to white (rise) and back to black (fall). Gray-to-gray response time measures the time it takes for a pixel to change from one shade of gray to another. Each type has its uses.
Rise-and-fall response time has been clearly defined and has been the industry standard for years. As of yet, no such definition for gray-to-gray response time exists. In theory, gray-to-gray response time could be a useful spec, since it can measure the time required to switch between shades (as opposed to black and white). This should make it useful for indicating how an LCD will look showing the subtle shades of movies and games. However, the lack of an agreed-upon definition means vendors may use different ways of determining the spec. In short, response time specs are not always comparable from vendor to vendor. For more information on this issue, see "LCD Specs: Not So Swift."
Size: Though it may seem obvious, bear in mind the size of your workspace when deciding on the type of monitor to buy. A huge monitor may look appealing, but you want to make sure your desk is deep enough to let you view it from a comfortable distance. Just as you would with a television, you want to sit at a distance of about two times the diagonal measurement from the screen.
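The two-times-diagonal guideline translates directly into desk depth. A trivial helper (the 2.0 multiplier is a rule of thumb, not a standard):

```python
def comfortable_viewing_distance(diagonal_inches, multiplier=2.0):
    """Rule-of-thumb viewing distance: about twice the screen diagonal."""
    return diagonal_inches * multiplier

for diag in (17, 19, 24):
    dist = comfortable_viewing_distance(diag)
    print(f"{diag}-inch screen: sit roughly {dist:.0f} in. away")
```

So a 24-inch monitor wants roughly four feet between your eyes and the panel; if your desk is shallower than that, a smaller screen may serve you better.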
Physical adjustments: Almost all monitors come with tilt adjustment. If you spend a great deal of time in front of your monitor, you may want to find one that lets you adjust the height of the screen as well. You may find that it's worth a few extra dollars to get a monitor that will keep the screen at a comfortable height instead of making your neck do all the work. A monitor with side-to-side swivel adjustment makes it easier to show your screen to a nearby customer or coworker. Finally, if you need to view anything that's longer than it is tall--a full-page document, a long Web page, or a screen full of e-mail--you could get a lot of use out of a screen-pivot function. Just bear in mind that not every monitor with a pivoting screen includes image-pivoting software; you'll need that software to make your screen switch to portrait mode.
The Specs Explained
As with most PC peripherals, monitors introduce you to a ton of unfamiliar specs. While price or specifications alone shouldn't determine what you buy--what you'll use it for is important as well, and image quality is the most important thing to most users--here are some things to look for to narrow your search.
Flat-Panel LCD Displays: Features and Specifications Guide
Important: Native resolution. Images look best when displayed at an LCD's native resolution. You can run the screen at a lower resolution, but the scaled image may appear blurry. The vector graphics of Windows Vista may lessen this, but native resolution will always look sharpest. Some models are better than others at handling non-native resolutions. (Remember that with LCDs the native resolution is the maximum resolution you can display.)
Important: Panel size. Unlike a CRT's size (which may refer to either the tube diagonal or the smaller viewable-screen diagonal), an LCD's panel size indicates viewable size. As with CRTs, the measurement is made diagonally from one corner of the screen to the opposite corner. Too small a panel, and you'll have trouble cramming everything you need to see on your screen; too large, and you may have to crane your neck.
Important: Physical adjustments. Height adjustment lets you adjust your monitor to a comfortable physical level. Swivel is useful for sharing your work, and pivot is handy for viewing applications that are taller than they are wide.
Somewhat important: Contrast ratio. Contrast ratio can help you determine how rich the color will be in on-screen images. A higher ratio is better, but vendor specifications are not always accurate.
Somewhat important: Viewing angle. Indicates how far you can move to the side of (or above and below) the center of the screen and still see what's displayed. This is important when you use the LCD to make presentations, or when you work with another person. Vendors use different methods to measure viewing angles, so make the final judgment yourself by visual comparison.
Somewhat important: Brightness. All LCDs generally provide more than enough brightness. In fact, most users find they have to turn the monitor's brightness down after purchasing.
Minor: Response time. Rise-and-fall response time indicates the time required for a pixel to change from black to white (rise) and back to black (fall). A low figure in milliseconds should indicate a screen that will display only minimal motion artifacts in moving images during games or video. Gray-to-gray response time does not have a standard definition, and is a less reliable indicator.
Monitor Shopping Tips
Now that you know what specifications are available, read PC World's recommendations before you start shopping.
General Monitor Buying Tips
Try before you buy. When it comes to choosing the monitor you will be staring at for the next few years, only your eyes can tell you if a monitor's image quality, resolution, and size are right for you. Don't buy displays over the Web or by mail order unless the seller has an unconditional return policy and, ideally, no restocking fee. Checking out models in a store can be helpful, but keep in mind that they are often hooked up to low-quality video signals and placed under different lighting from what you have in your office or home. If possible, try to find a vendor with a liberal return policy, so you can try the monitor in your own setting before committing to the purchase.
Check screen real estate. Make sure you have enough screen for what you need to do. Remember that the viewable area of a wide-screen monitor is generally comparable to the viewable area of a regular-format monitor that's 2 inches smaller. Similarly, the viewable size of a CRT is an inch or two smaller than the advertised tube size--so if you're switching from a CRT to an LCD, you may not need as big a monitor as you think. Also bear in mind that if you're switching from an LCD with a regular aspect ratio to a wide-screen one, the wide-screen will have less real estate at the same diagonal measurement. A 19-inch wide-screen is comparable to a 17-inch regular-format LCD. The current sweet spots for display size are the 19-inch regular-format LCD and the 20-to-22-inch wide-screen LCD, both of which provide plenty of desktop space for most users.
Gain more screen space by using two monitors. Consider using multiple smaller monitors instead of one big one. With the right video card, you can run both simultaneously off the same PC. A pair of 17-inch LCDs will let you do video or image editing in one window, and word processing or Web browsing in the other. This can be a great way to get more use out of old monitors. If the double footprint gives you pause, consider mounting two small LCDs on a stand: look for monitors with good screen quality that support the VESA Flat Panel Mounting Interface, plus an FPMI-compatible stand.
Consider USB ports. Universal Serial Bus connections are designed for quick and easy attachment of numerous peripherals. When USB debuted, the physical accessibility of a monitor made it a natural choice for housing a number of the new, smart, hot-pluggable ports (although the inclusion of USB adds to a monitor's cost). The number of ports provided varies with different models, as does the number of ports that are up front versus on the back. Current monitors are likely to include USB 2.0 hubs. USB 1.1 is fast enough for hooking up lower-performance devices, such as keyboards, mice, and even broadband modems. USB 2.0 devices, such as CD-RW drives and hard drives, will work with USB 1.1 ports, but at lower speeds than with USB 2.0 ports.
Decide whether you want speakers. The inclusion of speakers in a monitor can be a nice way to save space on your desktop. But despite recent advances, their sound will rarely satisfy the discerning ear. If you're picky about sound quality, save the money for a nice set of speakers with a subwoofer.
Donate or Recycle Your Old Monitor
Never, ever send your old monitor to the dump. Recycling is not only good for the environment, but it's also a legal requirement in some states that will not accept monitors in regular municipal landfills. CRTs contain 4 to 6 pounds of lead plus other toxic materials that could leach into the soil and water in minute quantities if not properly disposed of. At this point, most LCDs contain lead and nearly all contain mercury.
As long as your old monitor is still working, it could be a boon to someone else. Check local listings for charities that accept computer equipment. If you'd like to help a lucky individual, join your local Freecycle group or post a notice in the Free section of the For Sale category of Craigslist.org in your region. If you're sure your old monitor is no good to anybody any more, check with the vendor or your local government to find recyclers in your area that can handle monitors.
Check out these stories for advice on recycling monitors and other electronic devices: "Tips & Tweaks: Recycle PCs, Notebooks, and Components" and "Easy Ways to Recycle Old PCs and Cell Phones--Really!."
Rabu, 05 November 2008
printer
Take That, Stupid Printer!
How to fight back against the lying, infuriating, evil ink-and-toner cabal.
By Farhad Manjoo | Posted Thursday, Aug. 21, 2008, at 3:21 PM ET
I bought a cheap laser printer a couple years ago, and for a while, it worked perfectly. The printer, a Brother HL-2040, was fast, quiet, and produced sheet after sheet of top-quality prints—until one day last year, when it suddenly stopped working. I consulted the user manual and discovered that the printer thought its toner cartridge was empty. It refused to print a thing until I replaced the cartridge. But I'm a toner miser: For as long as I've been using laser printers, it's been my policy to switch to a new cartridge at the last possible moment, when my printouts get as faint as archival copies of the Declaration of Independence. But my printer's pages hadn't been fading at all. Did it really need new toner—or was my printer lying to me?
To find out, I did what I normally do when I'm trying to save $60: I Googled. Eventually I came upon a note on FixYourOwnPrinter.com posted by a fellow calling himself OppressedPrinterUser. This guy had also suspected that his Brother was lying to him, and he'd discovered a way to force it to fess up. Brother's toner cartridges have a sensor built into them; OppressedPrinterUser found that covering the sensor with a small piece of dark electrical tape tricked the printer into thinking he'd installed a new cartridge. I followed his instructions, and my printer began to work. At least eight months have passed. I've printed hundreds of pages since, and the text still hasn't begun to fade. On FixYourOwnPrinter.com, many Brother owners have written in to thank OppressedPrinterUser for his hack. One guy says that after covering the sensor, he printed 1,800 more pages before his toner finally ran out.
Brother isn't the only company whose printers quit while they've still got life in them. Because the industry operates on a classic razor-and-blades business model—the printer itself isn't pricy, but ink and toner refills cost an exorbitant amount—printer manufacturers have a huge incentive to get you to replace your cartridges quickly. One way they do so is through technology: Rather than printing ever-fainter pages, many brands of printers—like my Brother—are outfitted with sensors or software that try to predict when they'll run out of ink. Often, though, the printer's guess is off; all over the Web, people report that their printers die before their time.
Enter OppressedPrinterUser. Indeed, instructions for fooling different laser printers into thinking you've installed a new cartridge are easy to come by. People are even trying to sell such advice on eBay. If you're at all skilled at searching the Web, you can probably find out how to do it for free, though. Just Google some combination of your printer's model number and the words toner, override, cheap, and perhaps lying bastards.
Similar search terms led me to find that many Hewlett-Packard printers can be brought back to life by digging deep into their onboard menus and pressing certain combinations of buttons. (HP buries these commands in the darkest recesses of its instruction manuals—see Page 163 of this PDF.) Some Canon models seem to respond well to shutting the printer off for a while; apparently, this resets the system's status indicator. If you can't find specific instructions for your model, there are some catchall methods: Try removing your toner cartridge and leaving the toner bay open for 15 or 20 seconds—the printer's software might take that as a cue that you've installed a new cartridge. Vigorously shaking a laser toner cartridge also gets good results; it breaks up clumps of ink and bathes the internal sensor in toner.
These tricks generally apply to laser printers. It's more difficult to find ways to override ink-level sensors in an inkjet printer, and, at least according to printer manufacturers, doing so is more dangerous. I was able to dig up instructions for getting around HP inkjets' shut-off, and one blogger found that coloring in his Brother inkjet cartridge with a Sharpie got it to print again. But I had no luck for Epson, Lexmark, Canon, and many other brands of inkjets. There are two reasons manufacturers make it more difficult for you to keep printing after your inkjet thinks it's out of ink. First, using an inkjet cartridge that's actually empty could overheat your printer's permanent print head, leaving you with a useless hunk of plastic. Second, the economics of the inkjet business are even more punishing than those of the laser business, with manufacturers making much more on ink supplies than they do on printers.
Inkjet makers have a lot riding on your regular purchases of ink—and they go to great lengths to protect that market. In 2003, the British consumer magazine Which? found that inkjet printers ask for a refill long before their cartridges actually go dry. After overriding internal warnings, a researcher was able to print 38 percent more pages on an Epson printer that had claimed it didn't have a drop left. Lawyers in California and New York filed a class-action lawsuit against Epson; the company denied any wrongdoing, but it settled the suit in 2006, giving customers a $45 credit. A similar suit is pending against Hewlett-Packard.
There's also a long-standing war between printer makers and third-party cartridge companies that sell cheap knockoff ink packs. In 2003, Lexmark claimed that a company that managed to reverse-engineer the software embedded in its printer cartridges was violating copyright law. Opponents of overbearing copyright protections were alarmed at Lexmark's reach; copyright protections have traditionally covered intellectual property like music and movies, not physical property like printer cartridges. A federal appeals court dismissed Lexmark's case, but manufacturers have recently been successful in using patent law to close down third-party cartridge companies.
In the long run, though, the printer companies' strong line against cartridge makers seems destined to fail. Buying ink and toner is an enormous drag. Having to do it often, and at terribly steep prices, breeds resentment—made all the worse by my printer's lying ways. Some companies are realizing this. When Kodak introduced a new line of printers last year, it emphasized its low ink costs. Kodak claims that its cartridges last twice as long as those of other printers and sell for just $10 to $15 each, a fraction of the price of other companies' ink. When my Brother finally runs dry, perhaps I won't replace the toner—I'll replace the printer.
Selasa, 28 Oktober 2008
Small Packages For .Net Micro Framework
William Wong | ED Online ID #19876 | October 7, 2008
With the latest installment in beta testing, Microsoft’s .NET Micro Framework is into its third incarnation. The production version is 2.5. It targets more compact platforms like those from SJJ Micro’s EDK (Fig. 1) and Design Solutions’ Tahoe (Fig. 2). The EDK runs an ARM9-based chip from Cirrus Logic and the Tahoe uses Freescale’s i.MXS core. Both provide a similar development environment for .NET Micro Framework applications.
To start, I took a look at the hardware from both companies and then Microsoft’s software. The development experience for both platforms was similar, although they have a different hardware complement. Both come with cables, power supplies, and software including customization for the Microsoft .NET Micro Framework SDK for the respective hardware platforms.
SJJ Micro
The EDK board has the same dimensions as a PC/104 board of 3.77 in. by 3.54 in. It houses a 200-MHz Cirrus Logic EP9302 ARM9 processor. The board also has 8 Mbytes of flash, 8 Mbytes of SDRAM and a battery-backed real-time clock. Removable storage is provided via an MMC/SD hot-swap socket.
The EP9302 has a pair of serial ports, one of which is used for development. There is also a pair of USB 2.0 host ports, 10/100 BaseT Ethernet, SPI/I2S, a 5-channel 12-bit analog-to-digital converter (ADC), 16 GPIO lines, 16 PLD 5-V inputs, 16 PLD outputs (eight with high drive capability) and two pulse-width modulators (PWMs).
The form factor is nice if you happen to have the right case handy, but most tend to sit on the lab bench with cables connected to them.
Design Solutions Tahoe
Design Solutions’ Tahoe is built around their Meridian CPU surface-mount module (35 mm by 35 mm) found on the bottom of the board. The module contains a 100-MHz ARM920T-based Freescale i.MXS processor. The module adds 4 Mbytes of flash and 8 Mbytes of SDRAM. The module approach is easier for deployment and allows a simpler host board to be used.
The Tahoe host board contains a QVGA display. It also brings out 32 GPIOs on headers for expansion, which can include modules available from Design Solutions, including an Ethernet interface. Other available peripherals include two UARTs, an I2C port, an SPI port, and a USB port. Power can be supplied to the board using a 5-V adapter or the USB port.
Design Solutions would like to sell lots of modules as the Tahoe is for demonstrations and development. The use of a module can greatly simplify a target design and it eliminates much of the more complex board design necessary for today’s micros. The memory complement should be about right for most applications that the processor can handle.
Microsoft .NET Micro Framework Tools
Those expecting a stripped down version of Windows CE will be disappointed. There is just not enough horsepower on most of the targets for .NET Micro Framework to handle something that large. On the other hand, those looking for support for tools like C# will be pleasantly surprised by the compatibility and support that Microsoft delivers.
As with most Microsoft platforms, board vendors deliver a board support package (BSP) that mates with the standard development tools and software that Microsoft delivers. This includes Visual Studio and the .NET Micro Framework SDK. These can be downloaded from Microsoft’s site. The SDK is free and 90-day versions of Visual Studio 2008 are available as well.
Each development kit includes documentation on integrating these tools with the BSP. The process is not trivial, but it is straightforward and essentially the same for most platforms.
The interface to the two boards is via a serial port. The Tahoe board has a JTAG interface but you need the matching JTAG hardware. This is faster and more effective, but the serial port is adequate for testing the board and for many applications. There is a JTAG and PLD program header for the SJJ EDK board as well.
The .NET Micro Framework is a subset of a subset: the full .NET Framework runs on Windows platforms, and the .NET Compact Framework runs on Windows CE platforms. All include the CLR (common language runtime), which lets every platform run a range of languages, including C# and VB.NET. Applications written in these languages, and any other compatible languages, will run on the two boards, assuming the other runtime components and the application fit within the available memory.
The .NET Micro Framework actually runs a small version of the CLR called TinyCLR. TinyCLR provides core services including memory management, thread management, exception handling and debug services. It handles managed code generated by .NET compilers such as C#.
The systems use a TinyBooter boot loader to bring up the CLR and then the application. It can handle downloading of application code as well as interaction with the Windows host development system running Visual Studio 2008.
The main limitation of both systems is the serial interface. It is workable but slow, and most developers will want to move to JTAG if they can. The Ethernet interface on the SJJ board is still in development, so that high-speed link is not yet available; both it and the Ethernet plug-in from Device Solutions are designed for application use. MMC/SD card support is available from DotVision.com; I was not able to test that out, unfortunately. Everything was set up as Visual Studio projects, so it was a simple matter to generate the applications. Direct I/O is possible using interface objects like InputPort, and complete documentation for the boards and chips is included.
Keep in mind that the .NET Micro Framework is not a real-time system unless it is placed on top of an RTOS. It provides scheduling services and worked quite well for a range of basic applications as is. It can run on top of RTOSes such as Express Logic’s ThreadX, but these will be separate purchases.
Overall, the process is not much different than using C or C++ to do development. The modularity of .NET is nice but the benefits are more likely to come from a common development platform that supports a wider range of programming languages such as Iron Python. Likewise, there is a level of compatibility between platforms that run .NET Micro Framework and even the higher level .NET environments that only similar platforms such as Java provide. Still, as you get closer to the hardware, this portability becomes more limited. In the case of most of the sample applications provided with the kits, the portability is almost nil since they rely on hardware and support within the BSP to run.
With its latest installment in beta testing, Microsoft’s .NET Micro Framework is into its third incarnation; the production version is 2.5. It targets compact platforms like SJJ Micro’s EDK (Fig. 1) and Device Solutions’ Tahoe (Fig. 2). The EDK runs an ARM9-based chip from Cirrus Logic and the Tahoe uses Freescale’s i.MXS core. Both provide a similar development environment for .NET Micro Framework applications.
To start, I took a look at the hardware from both companies and then Microsoft’s software. The development experience on the two platforms was similar, although their hardware complements differ. Both kits come with cables, power supplies, and software, including customizations of the Microsoft .NET Micro Framework SDK for the respective hardware platforms.
SJJ Micro
The EDK board has the same dimensions as a PC/104 board of 3.77 in. by 3.54 in. It houses a 200-MHz Cirrus Logic EP9302 ARM9 processor. The board also has 8 Mbytes of flash, 8 Mbytes of SDRAM and a battery-backed real-time clock. Removable storage is provided via an MMC/SD hot-swap socket.
The EP9302 has a pair of serial ports, one of which is used for development. There is also a pair of USB 2.0 host ports, 10/100 BaseT Ethernet, SPI/I2S, a 5-channel 12-bit analog-to-digital converter (ADC), 16 GPIO lines, 16 PLD 5-V inputs, 16 PLD outputs (eight with high drive capability) and two pulse-width modulators (PWMs).
The form factor is nice if you happen to have the right case handy, but most tend to sit on the lab bench with cables connected to them.
Device Solutions Tahoe
Device Solutions’ Tahoe is built around the company’s Meridian CPU surface-mount module (35 mm by 35 mm), found on the bottom of the board. The module contains a 100-MHz, ARM920T-based Freescale i.MXS processor along with 4 Mbytes of flash and 8 Mbytes of SDRAM. The module approach eases deployment and allows a simpler host board to be used.
The Tahoe host board contains a QVGA display. It also brings out 32 GPIOs on headers for expansion, which can accept modules available from Device Solutions, including an Ethernet interface. Other available peripherals include two UARTs, an I2C port, an SPI port, and a USB port. Power can be supplied to the board through a 5-V adapter or the USB port.
Device Solutions would like to sell lots of modules, as the Tahoe itself is for demonstrations and development. Using a module can greatly simplify a target design, eliminating much of the more complex board design that today’s micros require. The memory complement should be about right for most applications the processor can handle.
Microsoft .NET Micro Framework Tools
Those expecting a stripped down version of Windows CE will be disappointed. There is just not enough horsepower on most of the targets for .NET Micro Framework to handle something that large. On the other hand, those looking for support for tools like C# will be pleasantly surprised by the compatibility and support that Microsoft delivers.
As with most Microsoft platforms, board vendors deliver a board support package (BSP) that mates with the standard development tools and software that Microsoft delivers. This includes Visual Studio and the .NET Micro Framework SDK. These can be downloaded from Microsoft’s site. The SDK is free and 90-day versions of Visual Studio 2008 are available as well.
Each development kit includes documentation on integrating these tools with the BSP. The process is not trivial, but it is straightforward and essentially the same for most platforms.
The interface to both boards is via a serial port. The Tahoe board also has a JTAG interface, but you need the matching JTAG hardware. JTAG is faster and more effective, though the serial port is adequate for testing the board and for many applications. The SJJ EDK board has a JTAG and PLD programming header as well.
The .NET Micro Framework is a subset of a subset: the full .NET Framework sits on Windows platforms, and the .NET Compact Framework sits on Windows CE platforms. All include the common language runtime (CLR), which lets each platform run a range of languages, including C# and VB.NET. Applications written in these languages, or any other compatible language, will run on the two boards, assuming the runtime components and the application fit within the available memory.
The .NET Micro Framework actually runs a small version of the CLR called TinyCLR. TinyCLR provides core services including memory management, thread management, exception handling, and debug services. It executes managed code generated by .NET compilers such as the C# compiler.
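As a sketch of what TinyCLR executes, a minimal .NET Micro Framework application is just a C# class with a static Main, with the Microsoft.SPOT assemblies standing in for the desktop base class library. The namespace and class names here are illustrative, not taken from either kit:

```csharp
using Microsoft.SPOT;

namespace HelloMicro
{
    public class Program
    {
        public static void Main()
        {
            // Debug.Print routes text back to the Visual Studio Output
            // window over the board's debug link (serial or JTAG).
            Debug.Print("Hello from TinyCLR");
        }
    }
}
```

Deploying from Visual Studio compiles this to managed code that is downloaded to the board and run by TinyCLR.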
The systems use the TinyBooter boot loader to bring up the CLR and then the application. TinyBooter also handles downloading of application code as well as interaction with the Windows host development system running Visual Studio 2008.
The main limitation of both systems is the serial interface. It is workable but slow, and most developers will want to move to JTAG if they can. The Ethernet interface on the SJJ board is in development, so that high-speed link is not yet available; both it and the Ethernet plug-in from Device Solutions are designed for application use. MMC/SD card support is available from DotVision.com, though I was not able to test it. Everything was set up as Visual Studio projects, so it was a simple matter to generate the applications. Direct I/O is possible using interface objects like InputPort, and complete documentation for the boards and chips is included.
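As a rough sketch of that direct I/O model, the Microsoft.SPOT.Hardware classes wrap GPIO pins as objects. The pin numbers below are hypothetical; the actual mapping comes from each board's BSP documentation:

```csharp
using System.Threading;
using Microsoft.SPOT.Hardware;

namespace GpioSketch
{
    public class Program
    {
        public static void Main()
        {
            // Hypothetical pin assignments: check the BSP docs for the
            // real GPIO numbering on the EDK or Tahoe.
            OutputPort led = new OutputPort((Cpu.Pin)1, false);
            InputPort button = new InputPort((Cpu.Pin)2, false,
                Port.ResistorMode.PullUp);

            while (true)
            {
                // Mirror the (active-low) button state onto the LED.
                led.Write(!button.Read());
                Thread.Sleep(50);
            }
        }
    }
}
```

The object-per-pin approach keeps application code readable, at the cost of routing every pin access through the managed runtime rather than touching registers directly.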
Keep in mind that the .NET Micro Framework is not a real-time system unless it is placed on top of an RTOS. It provides scheduling services and worked quite well as-is for a range of basic applications. It can run on top of RTOSes such as Express Logic’s ThreadX, but those are separate purchases.
Overall, the process is not much different from using C or C++ for development. The modularity of .NET is nice, but the benefits are more likely to come from a common development platform that supports a wide range of programming languages, such as IronPython. Likewise, there is a level of compatibility between platforms running the .NET Micro Framework and even the higher-level .NET environments that otherwise only comparable platforms such as Java provide. Still, as you get closer to the hardware, this portability becomes more limited. For most of the sample applications provided with the kits, portability is almost nil, since they rely on hardware and BSP support to run.
Minggu, 26 Oktober 2008
electronic music
Electronic music is music that employs electronic musical instruments and electronic music technology in its production.[1] In general a distinction can be made between sound produced using electromechanical means and that produced using electronic technology.[2] Examples of electromechanical sound-producing devices include the telharmonium, Hammond organ, and the electric guitar. Purely electronic sound production can be achieved using devices such as the theremin, sound synthesizer, and computer.[3]
Electronic music was once associated almost exclusively with Western art music but from the late 1960s on the availability of affordable music technology meant that music produced using electronic means became increasingly common in the popular domain.[4] Today electronic music includes many varieties and ranges from experimental art music to popular forms such as electronic dance music.
electronic
Electronics refers to the flow of charge (moving electrons) through nonmetal conductors (mainly semiconductors), whereas electrical refers to the flow of charge through metal conductors. For example, the flow of charge through silicon, which is not a metal, comes under electronics, whereas the flow of charge through copper, a metal, comes under electrical. This distinction started around 1908 with Lee De Forest's invention of the triode. Until 1950 this field was called "radio technics" because its principal application was the design and theory of radio transmitters and receivers.
The study of new semiconductor devices and related technology is considered a branch of physics whereas the design and construction of electronic circuits to solve practical problems comes under electronics engineering. This article focuses on engineering aspects of electronics.