Questions:
1. Is it OK to leave my wireless broadband router on all the time, even when I switch off the computer, or when I am using programs such as Word or PowerPoint rather than the Internet? Could someone hack into my computer files even when I am not browsing the Internet?
2. Will someone be able to hack into my computer if I use the standard firewall provided with Windows XP?
3. From a security point of view, is a wired broadband Internet connection safer than a wireless broadband connection via an encrypted home wireless router?
Answers:
Some of the greatest risks are a result of simply using your internet connection: you could visit a fraudulent or otherwise malicious website that steals personal information (see Phishing) or download a file that is infected by malware. However, even when you are performing other tasks, such as typing a Word document, or are away from your computer entirely, you are still at risk. As long as your computer is connected to the internet there is the possibility of being hacked. The good news is that there are things you can do to protect yourself and significantly reduce that risk.
First, I would recommend replacing the standard Windows XP firewall. This is a recommended step regardless of whether you are using a wireless router. No firewall is impenetrable, but the XP firewall has several weaknesses compared to other protection on the market, the primary deficiency being its general lack of outbound protection. In short, that means that if you do become infected and the malware attempts to "phone home" for whatever reason (including to send the hacker your personal information), it will not alert you or do anything to stop it. Instead, I would recommend using the free ZoneAlarm basic firewall, which can be downloaded from ZoneAlarm's website. The default settings are fine, and it will automatically disable the Windows Firewall to prevent possible conflicts.
It is also important to note that, depending on which router you have, there may be a hardware-based firewall built in. Hardware-based firewalls are considered superior to software firewalls because they stop the access attempt before it even reaches your computer and they are harder to circumvent, whereas software-based firewalls can be disabled by certain types of malware. Regardless, I would still recommend having a software-based firewall, such as ZoneAlarm, for an extra layer of protection.
In addition to a firewall you should make sure you have adequate antivirus and anti-spyware software and keep it up-to-date. Norton and NOD32 are two of the top-ranked antivirus providers with McAfee, TrendMicro, and many others following closely behind. For spyware protection I rely on Webroot's SpySweeper, though you do not need to spend the money on it. Instead, you can use Windows Defender for real-time protection and Ewido (now known as AVG Anti-Spyware) for weekly scans, a combination that is quite successful and won't cost you a cent. Remember, you can build up a wall around your network but it won't do much good if the intruder is already inside the gates, so perform weekly scans religiously.
That said, a wired connection is always safer than a wireless connection to the internet for the simple reason that with a wireless connection someone could sit in the basement of the house next door and work on cracking the encryption or uncovering your password. With a wired connection, on the other hand, they would have to have physical access to your computer network in order to attempt that. That should not deter you from using a wireless home network but instead encourage you to take additional steps to protect yourself. These include:
1.) Encrypt the connection, preferably using WPA (or even WPA2) due to the widely-exploited weaknesses of WEP. If some devices are not compatible with WPA, though, be sure to enable WEP...some protection is better than none.
2.) Set a strong password on the router, using a combination of uppercase and lowercase letters, numbers, and symbols. The longer and more complicated it is, the harder it will be for someone else to guess or crack it using brute-force methods (see the sketch after this list).
3.) Change the internal IP subnet, router name, and password regularly. The more you do this the more you will keep 'outsiders' off balance. You will have to update each computer afterward but it is well worth it.
4.) Enable MAC filtering so that only the wireless devices you specifically allow have access to the network. It is possible to spoof (forge) a MAC address but every road block helps.
5.) Disable SSID broadcasting. This does not prevent your network from being attacked but it prevents the network's presence from being 'announced' to all wireless receivers within range, effectively removing the large arrow pointing to your house.
6.) Keep an eye on the router logs for unauthorized access attempts, particularly if you suspect something. The sooner you see unauthorized access attempts the sooner you can take countermeasures to prevent that party from succeeding.
7.) If all of your wireless devices use 802.11g disable 802.11b on the router. This has a very limited effect, but it would prevent others with older hardware from attempting to access your network.
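As an illustration of tip 2, here is a minimal sketch in Python (my own example, not from any router documentation) that generates a random password from mixed character classes using the standard secrets module; the length and symbol set are arbitrary choices.

    import secrets
    import string

    def make_password(length=16):
        # Draw each character from letters, digits and symbols using a
        # cryptographically secure random source.
        alphabet = string.ascii_letters + string.digits + "!@#$%^&*()-_=+"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(make_password())  # e.g. q7!Rf2@kLp9_zXe4

A longer length only helps, so raise it if your router accepts more characters.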
The exact methods of adjusting all of these settings depend on the router and operating systems in use, but you should find illustrated instructions in your owner's manual or, alternatively, on the manufacturer's website in their Help section.
Aside from that I would like to make a few other comments. First, it is important to be aware of Wake-on-LAN (WOL), which, if supported by your motherboard, would potentially enable someone to 'wake' your computer remotely via a network connection. That means that after you have shut down and gone to bed, someone could start your computer and accomplish their goals before the sun rises. Thus, if you turn your computer off at night or before you go away for any length of time, you should also consider shutting down the modem and router. It's not mandatory, but it is the only way to ensure the security of your network during periods when you do not have a watchful eye on it.
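To make the Wake-on-LAN risk concrete, here is a minimal Python sketch of how a 'magic packet' is built and broadcast; the MAC address is a placeholder, and this assumes WOL is enabled in the target machine's BIOS and network card. Anyone who can reach your network segment can send this, which is exactly why powering down the modem and router helps.

    import socket

    def wake(mac="01:23:45:67:89:ab"):
        # The magic packet is 6 bytes of 0xFF followed by the target
        # MAC address repeated 16 times, sent as a UDP broadcast.
        payload = bytes.fromhex(mac.replace(":", ""))
        packet = b"\xff" * 6 + payload * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(packet, ("255.255.255.255", 9))

    wake()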
Second, never install or use P2P software such as Limewire. Aside from the legal issues associated with most of the content being pirated, it opens a door for others to access your computer, potentially making all of your hard work go to waste. It is far better to download directly from the author or use download sites such as CNET's own download.com.
Finally, keep in mind that if someone else uses your network, you may be held responsible for whatever they do, be it downloading pirated content or hacking another computer. That is added incentive to keep your network secure and to limit who you let use your computers and/or network. (Never walk away and leave your desktop unsecured to attend to other guests unless you trust them.)
Best of luck in maintaining a secure wireless home network! John Wilkinson
P.S. While it does not pertain to your home network, I would like to leave you with one last piece of information: never do online banking or submit/access other personal information while you are using a wireless connection other than your own. There is no guarantee of security, and some people even set up hotspots with the sole purpose of acquiring the personal information of others.
CES 11-1-07 SanDisk’s 32GB SSD will sell for $600
Large-capacity flash-based drives had been used primarily in the military, aerospace and telecom industries, which required high-performance, reliable storage under harsh conditions. But these drives were very expensive. Now, with flash-memory costs dropping, solid-state drives are becoming economically and commercially viable.
In addition to being reliable, these drives are fast. SanDisk claims a sustained read rate of 62 megabytes per second and a random read rate of 7,000 inputs/outputs per second. In plain English, that means it's more than 100 times faster than most current hard disk drives, which typically manage fewer than 100 random I/O operations per second.
Q. When shopping for a new computer, what kind of graphics card will be necessary to run Microsoft’s Windows Vista?
A. The Windows Vista operating system will be available in several versions with a range of system requirements, so your graphics-card needs will vary. Most hardware makers are labeling new computers as either Vista Capable to run the basic version, or Vista Premium Ready for a PC that can handle more Vista features.
To run the low-end Vista Home Basic version of the system, Microsoft states that a compatible PC needs to have at least an 800-megahertz processor, 512 megabytes of system memory and a graphics processor that can run the DirectX 9 multimedia software and has at least 32 megabytes of graphics memory.
While these hardware requirements are designed for the simplest version of Vista, they are not enough to display Windows Aero, its sophisticated graphical interface. Windows Aero - which works with the Vista Home Premium, Vista Business, Vista Enterprise and Vista Ultimate editions of the system - features shimmering effects like translucent windows and 3-D graphics.
According to Microsoft, a Windows Aero-compatible graphics card must be able to handle DirectX 9 and Pixel Shader 2.0 technology and must support 32 bits per pixel for color depth. The card needs at least 128 megabytes of graphics memory (although 256 megabytes of dedicated graphics memory is often recommended) and must be able to run the Windows Display Driver Model software. Also known as W.D.D.M., that is the code that lets Vista and the graphics card communicate properly.
The Windows Vista home page of Microsoft’s site has the full system requirements at www.microsoft.com/windowsvista.
Many PC makers now have areas of their Web sites devoted to Vista systems and information (www.dell.com/vista, for example) and may be helpful for research. For those looking to upgrade existing computers, graphics-card manufacturers like Nvidia (www.nvidia.com) and ATI (ati.amd.com) detail their product specifications on their Web sites and tell which cards work best with Microsoft's new system.
21:04 18-2-07
The typical sensor in a consumer camera is 0.5-0.7 inches. The more millions of pixels, the smaller each pixel must be, and the smaller the pixel, the less light-gathering efficiency it has and the worse the camera performs in low-light or stop-action shots.
Lots of you said yes, the sensor size is far more important. After all, it's undisputed that a 6-megapixel Nikon D40 digital S.L.R. takes better pictures than a 10-megapixel shirt-pocket camera, because its sensor is relatively gigantic. Its individual pixel sensors can be larger and soak in more light, even if there are fewer of them.
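A rough sketch of that pixel-size arithmetic, using approximate, illustrative figures (an APS-C DSLR sensor of roughly 23.7 mm width and 3008 horizontal pixels versus a 1/2.5-inch compact sensor of roughly 5.8 mm and 3648 pixels), not exact specifications:

    def pixel_pitch_um(sensor_width_mm, pixels_across):
        # Width of one pixel in micrometres.
        return sensor_width_mm / pixels_across * 1000

    dslr = pixel_pitch_um(23.7, 3008)    # ~6-megapixel DSLR
    compact = pixel_pitch_um(5.8, 3648)  # ~10-megapixel compact
    print(f"DSLR pixel: {dslr:.1f} um, compact pixel: {compact:.1f} um")
    print(f"light-gathering area ratio: ~{(dslr / compact) ** 2:.0f}x")

Each DSLR pixel here is several times wider, so its light-gathering area is more than an order of magnitude larger.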
15:55 24-2-07
You have to remember that graphics cards have come a long way. Not too long ago they were ISA cards; then PCI came in, then AGP took over, and now PCI Express is here.
Also make sure the power supply is up to the task. If it is a brand-name/retail computer, it will have a 250-to-300-watt (max) power supply. Most of the AGP/PCI Express cards start at this level and go up. In my experience, take this seriously: if the card demands more power than the supply can deliver, you are asking for trouble. Plan on spending $50 to $100 for a power supply that is between 350 and 500 watts.
I use a 400-watt PSU; it costs about $150. I also have a PCI Express slot, and it looks quite different from a standard expansion card slot.
17:59 1-3-07
(network) latency: Even though bandwidth is increasing, the pipes are getting filled with video, so the user experience will likely stay the same for the next three to five years.
Serial ATA (also SATA or S-ATA) is a computer bus designed for transporting data between the computer and the hard disk; it is the successor to the Parallel ATA (PATA/P-ATA, Advanced Technology Attachment) or IDE bus.
The first generation (SATA150) works at a maximum throughput of 150 MB/s. In 2004 a new generation (SATA II/SATA 300) was developed that raised the maximum throughput to 300 MB/s. A further generation, expected around 2007, raises the maximum throughput to 600 MB/s.
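As a worked example of what those throughput figures mean, here is a small sketch computing the theoretical minimum time to move 100 GB over each generation of the interface; real drives never saturate the bus, so these are lower bounds.

    # Peak interface rates in MB/s; 100 GB is treated as 100,000 MB.
    for name, rate in [("SATA 150", 150), ("SATA 300", 300), ("SATA 600", 600)]:
        minutes = 100_000 / rate / 60
        print(f"{name}: at least {minutes:.1f} minutes for 100 GB")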
Serial ATA uses thinner cables than the flat ribbon cables of Parallel ATA, leaving more room in the computer case and thus allowing better cooling. The cables cannot be connected the wrong way around, as was the case with many older IDE cables. There is also no longer a shared bus as with IDE, so the distinction between master and slave disks disappears.
An important advantage of SATA is that drives can be swapped while the power is on (hot-swappable).
Super Video Graphics Array, Super VGA or SVGA is a graphics standard for driving computer monitors.
The SVGA standard was developed by VESA, a consortium of monitor manufacturers.
As the name suggests, the SVGA standard is the successor to VGA. SVGA offers a higher display resolution than VGA: it can display 800x600 pixels. The colour depth of SVGA was also extended, to 16 million colours.
Nowadays this abbreviation is almost always used to indicate a resolution of 800x600 pixels.
Display mode standards
CGA | EGA | XGA | Hercules | MDA | MCGA | QVGA | SVGA | SXGA | UXGA | VGA | WXGA
VGA, or Video Graphics Array, is a standard for displaying images on a computer monitor.
The VGA standard was developed by IBM, which introduced it in 1987, and over the years it has grown into the de facto standard for driving PC displays.
The standard VGA graphics screen is 640x480 pixels, with an aspect ratio of 1:1 for the individual pixels. Each pixel is therefore square, which is clearer and nicer to look at, and it also simplifies graphics software.
With CGA and EGA the signals from the video card to the monitor were digital, but because more colours were needed, analogue signals were chosen.
VGA output is based on analogue signalling with separate colours. The three colours, red, green and blue (RGB), are each sent over their own line, and the horizontal and vertical alignment is driven by horizontal and vertical sync (synchronization) signals. The great advantage of RGB signalling over composite is that much higher resolutions are achievable with RGB. The VGA port on a PC always uses a 15-pin D-type connector. The analogue graphics cards in use accept separate sync as well as a 'sync-on-green' signal, with a standard refresh range of 55 Hz to well over 75 Hz.
VGA video cards could also be driven by programs as if they were CGA or EGA video cards; some models could even emulate the Hercules card.
Although many new standards with much higher and better resolutions have appeared since 1987 (SVGA, XGA, etc.), VGA remains the standard supported by every PC.
A VGA video card contains a RAMDAC with three digital-to-analogue converter channels, which turn digital data into analogue signals so the data can be shown on screen. RAMDAC speed is expressed in Hz.
Extended Graphics Array, or XGA, is a display mode standard. XGA is part of the VESA standard and has a resolution of 1024x768 pixels.
XGA should not be confused with VESA's EVGA (Extended Video Graphics Array), which was released at the same time.
Ubuntu OS
Ubuntu is one of the world’s most popular open source operating systems. Ubuntu is a Linux-based operating system that is community developed, updated regularly and offered for free. Designed to be user friendly, the Ubuntu OS comes with built-in software for office productivity (i.e., word processor, spreadsheets and presentation applications), e-mail, calendar, chat, web browsing, photos, and more.
The amount of RAM you have determines how many programs can be executed at one time and how much data can be readily available to a program. It also determines how quickly your applications perform and how many applications you can easily toggle between at one time. Simply put, the more RAM you have, the more programs you can run smoothly and simultaneously.
21-7-07
I'm expecting 32GB by the end of the year, or early next year. Now all we need is an adapter that can emulate an IDE or SATA interface so we can plug one of these in as a bootable primary hard drive. When that happens, instead of carrying around a full laptop, all we'll need to carry is the flash.
I can't wait for the flash-based 'hard drives' to hit the market. Gone will be the two minute boot-ups and in comes the near-instantaneous program and file opening. This has been a long time coming as the hard drive has been the slowest part of a computer since day one.
Sorry to burst your bubble, but flash is still FAR slower than hard disk drives. If you want blame for the slow boot, it's always been software bloat! Try booting up Windows 3.1 and associated applications on a modern P4 system. Blink and it's ready. Microsoft and all of its "features" are why your boot time is so long.
SanDisk has had SATA solid-state drive (SSD) replacement disks for laptops for months now. There is a 32GB model available today, and a 64GB model will be available by the end of the year or early next year.
Has anyone done studies on "bit rot" with flash drives? I know there are programs/processes to address this on hard drives, especially with long-duration programs or storage. Will the coming flash drive generation eliminate degradation, or have its own degradation problems?
Flash drives have a finite number of "cycles" per cell (each cell representing one bit). As someone who has used flash devices as storage for embedded devices (Linux wireless access points, for example), I can attest that if you use the flash memory for a volatile filesystem (/var/log on Linux, for example), it will last about 9-12 months.
In Linux there's JFFS2 (journalling flash filesystem v2), which provides "wear levelling", whereby the computer is directed to keep using different parts of the media as it writes data. On Windows (and in virtually every flash drive sold) the filesystem is FAT32, which has no such protection.
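A toy sketch of why wear levelling matters; the block and write counts are arbitrary illustrative numbers, and real controllers (and JFFS2) are far more sophisticated than this round-robin model.

    BLOCKS, WRITES = 1000, 100_000

    # Without wear levelling a hot file keeps hitting the same block.
    naive = [0] * BLOCKS
    for _ in range(WRITES):
        naive[0] += 1

    # With wear levelling the writes are rotated across all blocks.
    levelled = [0] * BLOCKS
    for i in range(WRITES):
        levelled[i % BLOCKS] += 1

    print("worst block without levelling:", max(naive))     # 100000 erases
    print("worst block with levelling:   ", max(levelled))  # 100 erases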
EFI is an abbreviation of Extensible Firmware Interface, a development by Intel intended eventually to replace the BIOS, which has undergone no major changes since the introduction of the IBM PC in 1981.
Thanks to EFI, a separate driver no longer has to be written for each operating system for a given piece of hardware. EFI handles all communication with the hardware, so only a driver for EFI needs to be written. Another big advantage of EFI is the presence of a shell, with roughly the same capabilities as MS-DOS 3.2 had.
Over-Voltage Protector (OVP)
A Dynamically Linked Library, also known as a DLL, is a library of functions that can be used by multiple applications. It is the opposite of a statically linked library, which must be built into every program that uses it.
The point of a DLL is that the library only has to be stored once on the hard disk, saving disk space, and only has to be loaded into memory once, even though multiple applications can use it.
A DLL is loaded when the first application needs it and can be removed from memory by the kernel once all the applications that were using it have released it. DLLs are used throughout Microsoft Windows, including DLLs used by the kernel itself.
A drawback of DLLs is that in the past several versions of the same DLL were often in circulation, causing conflicts, the so-called DLL hell. Nowadays this is remedied by using strict version numbers.
Linux also uses dynamically linked libraries, but calls them 'shared objects' (extension .so).
Real mode, also called real address mode or compatibility mode, is an operating mode of 80286 and later x86-compatible CPUs. Real mode is characterized by a 20-bit segmented memory address space (meaning that a maximum of 1 MB of memory can be addressed), direct software access to BIOS routines and peripheral hardware, and no concept of memory protection or multitasking at the hardware level. All x86 CPUs in the 80286 series and later start in real mode at power-on; 80186 CPUs and earlier had only one operational mode, which is equivalent to real mode in later chips.
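A small sketch of the address arithmetic behind that 20-bit limit: a real-mode physical address is the 16-bit segment shifted left four bits plus the 16-bit offset, which is what caps the space at roughly 1 MB.

    def physical(segment, offset):
        # Real-mode address translation: segment * 16 + offset.
        return (segment << 4) + offset

    print(hex(physical(0xF000, 0xFFF0)))  # 0xffff0, the x86 reset vector
    print(hex(physical(0xFFFF, 0xFFFF)))  # 0x10ffef, just past 1 MB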
The 286 architecture introduced protected mode, allowing for (among other things) hardware-level memory protection. Using these new features, however, required a new operating system that was specifically designed for it. Since a primary design specification of x86 microprocessors is that they be fully backwards compatible with software written for all x86 chips before them, the 286 chip was made to start in 'real mode' — that is, in a mode which turned off the new memory protection features, so that it could run operating systems written for the 8086 and the 80186. To this day, even the newest x86 CPUs start in real mode at power-on, and can run software written for any previous chip.
The DOS operating systems (MS-DOS, DR-DOS, etc.) operate in real mode. Early versions of Microsoft Windows (which were essentially just graphical user interface shells running on top of DOS, and not actually operating systems per se) ran in real mode, until Windows 3.0, which could run in either real or protected mode. Windows 3.0 could actually run in two "flavours" of protected mode - "standard mode", which ran using protected mode, and "386-enhanced mode", which is a virtualized version of standard mode and thus would not run on a 286. Windows 3.1 removed support for Real Mode, and it was the first mainstream operating environment which required at least an 80286 processor. Almost all modern x86 operating systems (FreeBSD, Linux, OS/2, Solaris, Windows 95 and later, etc.) switch the CPU into protected mode at startup.
The Windows registry is a directory which stores settings and options for the operating system in 32-bit versions of Microsoft Windows, 64-bit versions and Windows Mobile. It contains information and settings for all the hardware, operating system software, most non-operating-system software, users, preferences of the PC, etc. Whenever a user makes changes to Control Panel settings, file associations, system policies, or installed software, the changes are reflected and stored in the registry. The registry also provides a window into the operation of the kernel, exposing runtime information such as performance counters and currently active hardware. This use of the registry mechanism is conceptually similar to the way that Sysfs and procfs expose runtime information through the file system (traditionally viewed as a place for permanent storage), though the information made available by each of them differs tremendously.
The Windows registry was introduced to tidy up the profusion of per-program INI files that had previously been used to store configuration settings for Windows programs. These files tended to be scattered all over the system, which made them difficult to track.
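As a concrete aside (not part of the original passage), reading a registry value takes only a few lines with Python's standard winreg module on Windows; the key shown holds the Windows version string.

    import winreg

    key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                         r"SOFTWARE\Microsoft\Windows NT\CurrentVersion")
    value, value_type = winreg.QueryValueEx(key, "ProductName")
    winreg.CloseKey(key)
    print(value)  # e.g. the installed Windows edition name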
I have MS Office Home and Student 2007. After the first install, mine allowed me to skip activation up to 25 times; they call it a grace period. However, after the 25th time, the software goes into Limited Functionality mode and I cannot edit or save anything in Word or Excel until I activate the software. I have the full version. I think it's a new tactic by Microsoft. I personally don't like it.
Hard disks run faster and more consistently when not completely filled. Having a large block of unused disk space is essential for both speed and stability. So it’s useful to look through your hard disk and remove files that you no longer need (and you’ll probably be surprised how many there are!).
Before you begin, check how much space you currently have on your hard drive. To do this, open up My Computer, right-click on your hard disk’s icon, choose Properties, and then make a note of how many gigabytes or megabytes of free space you have.
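If you prefer a scriptable check, a quick sketch using Python's standard shutil.disk_usage reports the same numbers (this assumes the drive is C:):

    import shutil

    usage = shutil.disk_usage("C:\\")
    print(f"free: {usage.free / 2**30:.1f} GB of {usage.total / 2**30:.1f} GB")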
Now go through your My Documents folder. Delete what you don’t need, or if there’s stuff you want “just in case,” consider storing it on a CD, DVD, or second hard drive so your main drive doesn’t have to wade through it.
If you’re feeling up to it, clean out your fonts. Create a folder (maybe within My Documents) called “Unused Fonts.” Then use Windows Explorer to go to your C:\WINDOWS\FONTS folder, and drag any fonts you never use into that Unused Fonts folder. (You can double-click a font to see what it looks like.) Just moving those fonts will speed up some applications.Top
It sounds simple, but most of what the average user wants to do with a computer these days can be done online: word processing, spreadsheets, e-mail, photo editing, and more, which means limited storage is less of an issue. You want e-mail? Gmail and the included GTalk instant-messaging feature are free. And Google's Docs and Spreadsheets Web apps get all of your office productivity done online (though most of the three PCs reviewed come with open-source alternatives to Microsoft Office). For watching videos, there's YouTube and Hulu.com. And rather than downloading a photo-editing tool, anyone can upload their photos to Flickr and use Picnik's editing software right in the browser.
The success of Linux is driven, in part, by the fact that as people do an increasing percentage of day-to-day tasks like e-mail in the context of software as a service, it soon doesn't matter what operating system you have. If a majority of (computer) usage is browsing the Internet and doing things like that, (Linux) is perfectly credible, perfectly usable.
Dedicated video memory (also called discrete video memory) is memory that is on the video card (or the equivalent circuitry on the motherboard) and that is separate and distinct (e.g. using different memory chips) from the system memory.
Shared memory is when a portion of the system memory is "stolen" from the system memory for use by the video chip as video memory. This memory is no longer available as system memory. So, for example (an extreme example), if you have 512 MB of system memory (e.g. one 512 MB memory module) and a video system with 128 MB of shared video memory, your available system memory would drop to 384 MB (which is below the 512 MB minimum threshold for running Vista). Note that in modern systems the use of shared video memory can change dynamically while the system is running, so that at one moment it might only be using 32 MB and a bit later, when you are doing something that demands more memory, it might dynamically increase to 128 MB.
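The arithmetic is worth spelling out once; a trivial sketch using the figures from the example above:

    total_ram_mb = 512
    shared_video_mb = 128  # may vary dynamically, e.g. from 32 up to 128
    # Whatever the video chip borrows is unavailable to Windows.
    print(total_ram_mb - shared_video_mb)  # 384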
Shared memory is definitely inferior to dedicated video memory, other things being equal. But, that said, modern systems have gotten so good that shared-memory chipset video is "good enough" for MOST (not all) users. Also note that many modern video systems use a combination of some dedicated memory and the ability to use shared memory to further increase video memory only when needed.
Another comment, note that in today’s world, the absolute MINIMUM video system that you should consider is one that has (or can have, given that shared memory can be dynamic) AT LEAST 128 Megabytes of video memory. Note that this might be a combination of some dedicated video memory plus some shared video memory.
If you are going to be running Vista, keep in mind that Aero (the new Vista user interface) has a MINIMUM video memory requirement of 128MB, and that Vista itself has a minimum SYSTEM memory requirement of 512MB (although it’s crazy to run Vista with less than one gigabyte, and bumping up to two gigabytes is almost always a wise thing to do).
Your requirements don’t really suggest the need for a high-end video solution (again, see my comments to Joan in last week’s question), although I might want to investigate exactly what games you do play and it might change my response ... all that I have to go by for now is "casual gaming" and "nothing hard core". All of the other stuff you mentioned is actually 2-D video, so any video solution will suffice, and because of Vista and the Aero interface, all of the video systems being offered in new computers are powerful enough to meet the needs of pretty much any casual user who is not doing serious 3D gaming or using CAD or simulation software.
[Summary: 3D video is when the video card CREATES the image to be displayed, as happens in games and CAD software. 2D video is where all of the pixels in the displayed image are simply supplied to the card. Both digital photography and digital video are 2D ... the JPEG or TIFF or MPEG or AVI or {whatever} file actually contains the pixels to be displayed, at most (for example in DVD MPEG playback) they only have to be decoded (which is trivial for today's products), but they don't have to be CREATED by the video card. In contrast, in games and CAD software, there is a mathematical model of some 3D world or objects (with textures and lighting), but the video card itself has to take that mathematical description and use it to actually create the pixels that you see on your monitor. This is complex and demanding, and it's what really separates the "men from the boys", if you will. But in fact, people have a lot of misconceptions about this and when it's required. The fact is that, except for serious gaming and 3D software, very few applications use 3D video graphics, and in particular, digital photography and digital video don't use them, generally (one notable exception being fancy screen transitions in digital video editing).]
Shared Video Memory: Using part of main memory (RAM) for the display circuit's frame buffers, which temporarily hold the rendered content being sent to the screen. Shared memory is used in PCs that have the display circuit built into the motherboard rather than housed on a separate, more costly display adapter card.
Sharing main memory with the display function reduces the amount of memory available to applications, and main memory is not as fast as the specialized video memory on stand-alone cards.
On lower-end systems, the video is integrated into the motherboard. The video controller uses a certain amount of 'shared RAM' as video memory, which is taken away from main system RAM. Thus, on a 512 MB system, if the integrated controller uses '128 MB shared video RAM', the main system is left with 384 MB of RAM, and this is the number you'll see when checking the memory.
When you go out to purchase a computer, some models specify that they have a certain amount of megabytes of memory, and they may have a video card that supports a certain amount of shared memory. This means that when the video card is in use, especially in higher display modes, it will take some of the memory normally dedicated to other computing activities and use it as its own. Thus, if you buy a computer with 512 MB of memory and 128 MB of shared memory and you frequently use high display settings, you may actually have only 384 MB of physical memory left available to your computer. In some advanced systems, the use of shared video memory can be changed dynamically while the system is running; that is, at one time it may be using only 64 MB out of the 128 MB shared, and it changes dynamically as demand increases or decreases. While this may be fine for some people, if you have more money you may want to go with a computer that has memory dedicated to the video card, saving your physical memory for other uses. A computer with 512 MB that uses 128 MB of shared video memory will have a gorgeous display but run poorly, because Windows Vista only has 384 MB to use (the minimum recommended memory for Vista is 512 MB, plus 128 MB of video memory).
Dedicated memory means that the video card uses its own memory and doesn't share or take up your system RAM. Beyond that, a dedicated video card is good for graphics-intense applications (CAD), video editing and games, and will also help in running Windows Vista (with Aero, the new Vista user interface) smoothly.
The advantage of a video chip with shared memory is that it is cheaper. It won't be horrible, but it won't play games well (for lack of RAM, I wouldn't suggest it; this excludes flash games), and video editing won't be great (but that may not apply to ripping).
I would never buy a machine with shared video memory because 20% performance loss is not worth the small price savings realized.
As far you’re requirement goes you don’t require a high level graphics card, as you said, you’ll be doing some casual gaming I’ll suggest get a lower to mid-range video card. Make sure you buy a card that matches your expansion slot type. 256MB of dedicated memory should be enough for you. You can find cards ranging from 128MB to 2GB of memory, depending on how much you want to spend. Nvidia GeForce 8400 GS/8500 GT or ATI Radeon HD 2600 Pro are my preference for a mid-range budget. Make sure the card has a HDMI, Display Port, or a DVI output. This would also help you future proof your system.Top
Given that you used the term discrete video memory, this refers to dedicated graphics cards. These provide the most powerful class of graphics processing units (GPUs), which interface with the motherboard by means of an expansion slot. Two types of slots are available: PCI Express (PCIe) and Accelerated Graphics Port (AGP). Either card may be replaced or upgraded with relative ease. Earlier motherboards used Peripheral Component Interconnect (PCI) slots, which are limited in bandwidth and only used when PCIe or AGP slots are unavailable.
Integrated graphics solutions, or shared graphics solutions, are graphics processors that utilize a portion of a computer's system RAM rather than dedicated graphics memory. Such solutions are less expensive to implement than dedicated graphics solutions, but at the trade-off of being less capable. As a GPU is extremely memory-intensive, an integrated solution finds itself competing with the CPU for the already slow system RAM, since it has no dedicated video memory.
A dedicated GPU is not necessarily removable, given that it may not interface with the motherboard in a standard manner. Dedicated refers to the fact that dedicated graphics cards have RAM dedicated to the card, not that most such cards are removable.
In conclusion, given that you are interested in a later upgrade, you should purchase a modern desktop computer with a PCIe slot and the capability to add a PCIe card with at least 512MB of video memory. The newest games demand this even when they are casual; older games run on 256MB. Integrated video memory puts you between a rock and a hard place: as processing requirements increase, the processor demands more memory, thereby decreasing the video memory available for games, photo editing, or watching videos online.
When you have shared memory, the video "card" is integrated into the motherboard. It uses the same memory as the rest of the system; the video memory used is SUBTRACTED from main memory. This can be a big inconvenience. Another problem is that the memory bandwidth is shared between video processing and computing. You can set the amount of main RAM to be used for video, but you need to change it in the BIOS settings. You find that kind of setup on cheap motherboards, usually with pretty limited memory expandability. It can be OK for you if all you do is surf the internet, listen to music, do some accounting and text processing, and only play casual games like Solitaire and Minesweeper. You may get some degradation with some video, especially anything high-definition. Advantage: LOW price. Inconveniences: reduced available RAM, the lowest-end video processor, and no upgrades. The motherboard's BIOS may NOT allow you to use a separate video card, and there is no PCIe or AGP connector.
Dedicated is similar to the preceding. The motherboard contains a built-in "video card", but it is functionally separated from the rest of the computing circuitry. The advantage is that it's not shared with the rest of the system and has its own data bus, so video processing can no longer interfere with the computing. It is OK with just about any video playback; video editing may cause problems. Most games will play with acceptable performance, though upper-mid-range games and above will probably show jerky animation in the most intense parts. Middle price. The "card" can't become unseated. If you want to install a video card, you may need to get a PCI card, as those boards seldom have a dedicated video card connector.
With the preceding two options, if there is a video processor failure, you need to change the motherboard or install a dedicated video card.
Discrete is video RAM that resides on a video card. That memory is completely separate from system RAM, and the video processing is also independent from the computing. Some of those video cards can have 512 MB or even 1 GB of video RAM! TOP performance is possible... for a price! Prices range from low (about $70) to outrageous ($1000 and more).
Most video cards come with at least 128 MB of video RAM. You can still get some with less, but they may not allow you to use that LCD at its native resolution, and the savings, if any, will be negligible.
In your case, I'd stay well away from the shared kind. Video cards are easy to change; integrated video can't be changed.
Considering that even some "tame" video games can be very graphics-intensive, and that you may discover a taste for some more intense gaming, I'd go for the dedicated video card with 256 or 512 MB of RAM.
Photo editing doesn't ask much of the video card; any will do. It's a CPU/main-system-RAM task. The next step, video editing, asks for as much main RAM as possible and a fast hard drive, but very little video RAM; any lower-mid-range card will do.
As for the rest of the computer, look to get as much main RAM as you can. Also, a second hard drive where you put your data would be a very good idea. That way, when you need to reinstall Windows, all your data will be safe without the need to back it all up.
21-5-2008 VIA Isaiah CPU
The Via C7 processor is currently being used in a design that may herald more Isaiah-based mainstream notebooks: the $398 Everex gBook, with a 15-inch screen, a 1.5GHz Via C7-M processor, 512MB of DDR2 system memory, a 60GB hard disk drive, an optical drive, Ethernet, and wireless. It uses the gOS Version 2 operating system, a Linux distribution.
An idea put forward by Nvidia's CEO Jen-Hsun Huang postulates that a consumer will get better PC price-performance by adding a $50 graphics card rather than a two- or three-hundred-dollar quad-core processor.
You can have a processor like Isaiah matched with a better graphics card. There's opportunity in both desktops and notebooks.
21-5-2008 Dead flash drives
I know of one circumstance where any Windows XP machine will fail to recognize a flash drive; this may or may not be your problem. Some iterations of the Windows OS require portable drives to be stopped and ejected via the 'Safely Remove Hardware' wizard before any data is actually written to them. When data is copied onto portable media in this situation, Windows will show that it has been copied but will actually keep a log of the intended data transfer without carrying it out (Windows XP uses a delayed-write cache to speed up programs). When a user 'properly' removes a portable drive through the remove-hardware dialog, the logged transfers are actually performed and the files written to the device; Windows then marks the drive as disconnected and finally tells you that the flash drive can be removed.
Almost all flash memory devices use some form of 'hot pluggable' interface to connect them with the various electronic devices they support. Hot pluggable means that the memory can be attached and removed from a powered-on device without fear of damage or hardware failure. USB is the most obvious example of this technology, and one that we are all familiar with. The one problem with this type of interface is the sense of invulnerability it engenders in the user. We become so accustomed to inserting and removing our flash memory devices at will that we often forget to make sure that all data transfer tasks have stopped first.
Trouble arises when users simply yank the USB media out of the computer without using the safely remove hardware option. There is no surer way to mess up a portable storage device than to yank it out of its socket when it is halfway through an operation. If you just pull the flash drive out without doing the Safely Remove Hardware step, you may lose the data that you thought was already written to the flash drive. Also, Windows won't properly turn off its internal settings for the drive. That means, if you insert the drive again, Windows XP won't recognize it.
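For completeness, a minimal Python sketch of pushing one file's data through the write cache before unplugging; it assumes the stick is mounted as E:, and it only flushes that one file, so Safely Remove Hardware remains the reliable way to flush everything.

    import os

    with open(r"E:\notes.txt", "w") as f:
        f.write("important data\n")
        f.flush()             # empty Python's own buffer
        os.fsync(f.fileno())  # ask the OS to commit its write cache for this file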
Also, check in Disk Management and try to assign a drive letter.
Is it possible you're plugging this drive into a USB hub?
Try plugging it in directly to one of the USB ports in the back of your computer.
Drivers shouldn't be an issue as XP uses a generic USB mass storage driver which works with pretty much all thumb drive (and most external hard drive) chipsets.
If there's nothing important on the thumb drive, you can try repartitioning and formatting the drive. It doesn't make sense that it would work on one Windows XP machine and not another, but I've seen it happen.
Try this: Start > Run > diskmgmt.msc
CAUTION: Do not alter any other drives. ONLY your thumb drives! Find your thumb drive in the lower pane of the window, right click on the coloured partition blocks and triple check to make sure it is your thumb drive before continuing. Proceed to delete your thumb drive's partition(s).
Right click in the empty space of your thumb drive and create a new partition. Right click and format it in FAT or FAT32. It should work. And if you don't see your thumb drive listed, then something is wrong at the driver/hardware level.
Flash memory has a finite lifespan measured in erase and write cycles. That is to say, a specific block of NAND memory can only be written to and erased x number of times before it fails to reliably store data. In modern flash devices this number generally extends to millions of operations, and longevity is further ensured by an algorithm built into the supporting circuitry of the memory that forces data to be written evenly across the available memory blocks, preventing one area of memory from becoming more 'worn' and failing faster. Supplementing this is another system which ensures that 'worn' sectors are mapped out of the grid of available memory, similar to the method used to deal with bad sectors in hard disk drives. Flash memory can and does wear out, though. While a typical USB drive or memory card should last through years or decades of typical use, exposing flash media to more read-write-intensive operations, like running an operating system or hosting applications, will cause premature wear and tear and the eventual failure of the device. So never defragment these drives: by defragging, you will wear them out faster.
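To put rough numbers on 'years or decades', here is a back-of-the-envelope endurance estimate under loudly stated assumptions: 100,000 erase cycles per block, perfect wear levelling, a 32 GB drive, and 10 GB written per day (real figures vary widely by flash type).

    cycles_per_block = 100_000
    capacity_gb = 32
    written_gb_per_day = 10

    total_writable_gb = cycles_per_block * capacity_gb
    years = total_writable_gb / written_gb_per_day / 365
    print(f"~{years:,.0f} years at {written_gb_per_day} GB/day")

Even with far more pessimistic cycle counts, ordinary desktop use is unlikely to wear a drive out before it is obsolete.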
I have been using my USB flash drives/hard drives for years and only one has stopped working, and that was a hard drive, not a flash drive. I think it is extremely rare for three thumb drives to go "bad" simultaneously. This suggests there may be a problem with the way you are using them or with the environment you are using these drives in. Also check whether the USB extension cables you are using are fine. You could also be the victim of a bad lot or of counterfeit drives; you didn't mention whether you bought them together and from the same store. I bought what I thought was a Kingston 16 GB DataTraveler, and it turned out to be fake.
As far as data recovery is concerned, you can seek professional help if the data is critical, or opt for data recovery software, which may or may not work in your case. When attempting recovery of data from a corrupted drive, your success will vary depending on what exactly is wrong with the flash memory device in question. If the file system has been scrambled because some device performed an unexpected action or failed to read the drive correctly, you may well be able to recover your data using special data recovery software.
On the other hand, if your device is failing due to physical damage or wear and tear, data recovery depends entirely on what part of the flash memory is damaged. One positive is that, unlike hard drives, flash devices have no moving parts and thus do not generally fall victim to the 'snowball' damage effect, where data recovery efforts on a faulty drive inflict more damage on it even as they rescue some of the data.
On a Windows system ALWAYS use the green icon in the task bar to stop the drive and wait for the confirmation that it is safe to remove the drive. This will ensure all files are closed and buffers flushed. If you don't get the confirmation message, leave the drive in place until you shut the computer down. Any other method risks corrupting the data or worse, the FAT.
Don't defrag these drives. The drives do have a finite number of write cycles and, by default, the drives will scatter writes across the whole drive. By defragging, you move the data back to the starting sectors and you will wear them out faster. Fragmented flash drives do not have the performance problems associated with spinning disks.
Looking through your list of potential damages that you know haven't happened, the only omission I can see is a static discharge. Most flash drives, at least the metal-cased ones, are reasonably well protected, but a static discharge could affect the drive if it hit the connector. Try not to touch the actual connector.
Monitor (LCD TV) PPI (Pixels per inch):
Reading on a screen is tiring mainly because, in effect, a bright lamp is shining in your face. With e-paper, the intention from the very first step in its development has been to rely mainly on reflection of ambient light, just like real paper. The article mentions 40% reflectance, which still amounts to a fairly bright lamp.
With reflection, the brightness is a percentage of the ambient light, so it is not that bad (although reading a white book in full sunlight is not pleasant either).
The user slides the cursor with the mouse to an icon or a menu bar and clicks on it; a new window then springs open or a series of options drops down.
The fundamental breakthrough of the graphical user interface was that the perception space and the action space were brought together.
Before that, the screen was the perception space and the keyboard the action space. The mouse, like a kind of prosthesis, transfers the action of the hand onto the screen.