Wednesday, April 30, 2008

Introduction To ISDN, Part II

In the previous ISDN article, we looked at how and why one router dials another using ISDN. Just as important is knowing what keeps the link up once it is dialed.

Why? Because ISDN acts as a phone call between two routers, and it’s billed that way to your client. The two routers that are connected by this phone call may be located in different area codes, so now we’re talking about a long distance phone call.

If your ISDN link does not have a reason to disconnect, the connection could theoretically last for days or weeks before someone realizes what's going on. This is particularly true when the ISDN link is used as a backup for another connection type, as is commonly the case with Frame Relay. When the Frame Relay link goes down, the backup ISDN link comes up; when the Frame Relay link comes back up, the ISDN link should drop so you're not billed for all the time in between.

To understand why an ISDN link stays up when it's not needed, we first have to understand what keeps it up at all. Cisco's ISDN interfaces use an idle-timeout to determine when an ISDN link should be torn down; by default, this value is two minutes. They also use the concept of interesting traffic.

Once interesting traffic brings the link up, by default all traffic can cross the link. However, only interesting traffic resets the idle-timeout. If no interesting traffic crosses the link for two minutes, the idle-timer hits zero and the link comes down.
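As a sketch, the pieces fit together on the BRI interface like this (the interface and dialer-list numbers here are placeholders, not from the article):

```
! Hypothetical example - interface and list numbers are placeholders.
interface BRI0
 dialer idle-timeout 120   ! 120 seconds is the default
 dialer-group 1            ! ties this interface to dialer-list 1
!
! Dialer-list 1 defines interesting traffic. Here all IP traffic is
! interesting, so any IP packet both brings up the link and resets
! the idle-timer.
dialer-list 1 protocol ip permit
```

With a catch-all definition like this, routine traffic such as routing updates will keep resetting the timer, which is exactly the problem described below.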

If the protocol running over the ISDN link is RIP version 2 or EIGRP, the most efficient way to prevent routing updates from keeping the line up is to expressly deny their multicast update addresses in the access list that defines interesting traffic. Do not prevent them from crossing the link entirely, or the protocol obviously won't work correctly.
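A hedged sketch of that approach (the access-list and dialer-list numbers are placeholders): deny the protocols' well-known multicast addresses in the access list referenced by the dialer-list, and the updates become "uninteresting" without being blocked.

```
! EIGRP updates go to 224.0.0.10; RIPv2 updates go to 224.0.0.9.
! Denying them here only makes them uninteresting - they still cross
! the link once it is up, but they no longer reset the idle-timer.
access-list 101 deny eigrp any host 224.0.0.10
access-list 101 deny udp any host 224.0.0.9 eq rip
access-list 101 permit ip any any
!
dialer-list 1 protocol ip list 101
```

Note the final permit: everything else remains interesting, so real user traffic can still dial the link and keep it up.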

With OSPF, Cisco offers the ip ospf demand-circuit interface-level command. The OSPF adjacency will form over the ISDN link, but once formed, the Hello packets will be suppressed. However, the adjacency will not be lost. A check of the neighbor table with show ip ospf neighbor will show the adjacency remains at Full, even though Hellos are no longer being sent across the link. The ISDN link can drop without the adjacency being lost. When the link is needed again, the adjacency is still in place and data can be sent without waiting for OSPF to go through the usual steps of forming an adjacency.
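Applied to the interface, it looks like this (interface number is a placeholder; as I understand it, the command only needs to be configured on one end of a point-to-point link):

```
interface BRI0
 ip ospf demand-circuit   ! suppress Hellos once the adjacency is Full
```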

This OSPF command is vital for Cisco certification candidates at every level, but is particularly important for CCNA candidates. Learn this command now, get used to the fact that the adjacency stays up even though Hellos are suppressed, and add this valuable command to your Cisco toolkit.

One myth about ISDN is that Cisco Discovery Protocol (CDP) packets keep an ISDN link up. CDP is a Cisco-proprietary protocol that runs between directly connected Cisco devices. One school of thought holds that CDP packets have to be disabled on a BRI interface to prevent the link from staying up, or dialing, when it's not really needed. I've worked with ISDN for years in the field and in the lab, and I've never seen CDP bring up an ISDN link. Try it yourself the next time you're working on a practice rack!

Keep studying!

Sunday, April 27, 2008

Will The Online Quiz Make The Old Fashioned Printed Quiz Obsolete?

Q. Can you tell me the benefits of publishing an online quiz versus a printed one?

A. Well, the first benefit of publishing an online quiz that comes to mind is that it's an outstanding way to drive traffic to your web site. People love quizzes and take the ones printed in magazines all the time. It's reasonable to assume that if these same people knew about the existence of an online quiz, they would beat a path to your home page.

Another great feature of offering an online quiz is that you can change the subject regularly. In fact, you could offer a subscription where members sign up to be notified whenever a new online quiz gets posted.

An online quiz can also be used as a sales tool. One of the best ways of accomplishing this is to link your quiz questions to affiliate sites where you earn money when people make purchases. Then you can have a "scavenger hunt" online quiz where people have to visit those sites to find the quiz answers.

People also like to take an online quiz because they can get instant scoring without having to turn the page upside down, or flip to the end of the magazine, to find out how they did. You can even design your online quiz so that it gives instant feedback after every answer is entered.

One last thing to consider is that you can create keyword-rich versions of your online quiz and then post the URLs to search engines. That way you will not only get search traffic from people who like quizzes, but you'll get traffic from people who are searching on the terms that you include in your online quiz!

So, while it's not likely that the printed quiz will become obsolete any time soon, the Internet has once again forever impacted the way that we communicate. Today's latest example? The online quiz.

Saturday, April 26, 2008

Provisioning/User Management System Upgrades: Part Two – Building Awareness And Building Approval

Somewhere in the world is a person who wants to see their provisioning/user management systems get a sorely needed upgrade. But they seem to be getting nowhere.

The technical requirements are unarticulated. Key decisionmakers in the company are not aware this is needed. And the "project" is unfunded and without resources. How can someone who has the responsibility but not the authority get this upgrade to the next level?

This article will provide practical guidelines on how to build awareness and get funding for a provisioning/user management upgrade. Upgrade is meant to include new hardware and software and also the supporting environment of business processes, roles, organizations, business rules, etc. This discussion will include techniques for overcoming the approval and implementation obstacles detailed in the previous article.

1. Know What You Are In For

Congratulations, you just signed up to become a change agent. Change agents are easy to recognize because they have arrows in their backs! Usually they find their own rewards – the pride of having improved their company's infrastructure against all odds, new acquaintances made along the way, and the satisfaction of a job well done.

Note that it may take longer than you expected. There will be times of great progress and others of utter despair. Just keep at it and you will achieve your goal. And don't forget to have fun while you are doing it.

2. Document the “AS-IS” Environment

This should include not only the environment but also current metrics for id creation, deletion, changes across business units, and special cases. Great care should be taken in identifying gaps and risks in the current environment.

3. Document the “TO-BE” Environment

Create your own network identity roadmap if one does not exist. Base it on a combination of your own and colleagues' ideas, existing company policies and procedures, best practices, consulting think tank recommendations, and whatever else makes sense. Revise it as organizational and vendor realities change. By drawing a line in the sand with your network identity vision, you will force others to either agree with you or identify their own assumptions, risks, and implementation next steps.

4. Communicate Often

You can never communicate enough about IT infrastructure needs! Use a variety of mechanisms to tell the story such as electronic/printed newsletters, bulletin boards (physical and electronic), web site, blogs, face to face, speaking at department meetings, in-house seminars etc.

There are many things you can communicate – stories about the unsung heroes and heroines of provisioning and user management, current metrics, appropriate external meetings, webinars, and seminars, the "AS-IS" and "TO-BE" environments, successes at other companies, and of course stories illustrating user pain thresholds. Keep the information interesting, educational, and continuous.

5. Leverage Off of Other Company Success Stories

This may take some work. Continuously scan the trade magazines and the web for other companies' success stories. Get the technical and management contact names if possible.

Try to get hold of them. See if both types of contacts are willing to share their key documents and how they measured success. Even better, see if you can get them to speak to the corresponding folks at your company (especially the management contact talking with your management).

6. Use Vendors and Outside Consultants

Use them to educate your management and technical decisionmakers through webinars, seminars, and on-site meetings. Webinars are appealing because they are usually free, relatively short (typically one hour), can be done from your desktop, and provide an opportunity to ask questions and save a copy of the presentation.

Outside consultants may be helpful by coming on-site and reinforcing your efforts. This may include a talk on the state of network identity, evaluating your current provisioning/user management strategy, discussing current and future vendor releases etc.

7. Know Your Company’s Resources

Once you are underway with your effort, you will see that people fit into these categories:

* Allies – These are temporary and permanent employees who agree with your general framework of problem diagnosis and proposed resolution. Keep these people the best informed of the three categories through informal e-mails, one-on-one conversations, "brown bag" lunches, and brainstorming sessions. Use them to spread the word when appropriate. Possible candidates are those actually doing provisioning/user management, data management, security, HR, IT, and remote/roaming users.
* Potential Allies – These employees may take some convincing. But once convinced, they are on your side forever. They may need to be convinced through webinars, vendor talks, interfacing with their peers at other companies who have successfully implemented a provisioning/user management system, attending a conference, etc. Find out what their objections are and work on overcoming them. Constantly communicate to them about user pain and successes at other companies, especially metrics from before and after implementation.
* Challenges – These employees will need the most convincing because of educational, financial, emotional, and political concerns. Unfortunately, they are probably your approvers and will likely give you the least amount of time and attention. Your encounters need to be well planned and timed. You should have already reached a broad level of consensus and awareness on the issue. The problem and remedy should be clearly defined and documented. It could prove invaluable to read several books or take a course on relationship selling.

Conclusion

You can succeed at getting a provisioning/user management upgrade at your company. It will take a combination of great timing, targeted communications, both “hard” and “soft” skills, and the right people backing you up. Good luck and please write to me about your progress along the way.

Wednesday, April 23, 2008

Provisioning/User Management System Upgrades: Part One -- Ten Reasons Why Not To Do An Upgrade, or The Gentle Art of "Not Doing" When Good Systems Go Bad

Tommy Sherman daily monitors a helpdesk-provisioning queue for a large company. The current provisioning/user management system was written with homegrown software. It has not had a major update for several years. Each day, he is getting more and more frustrated. No matter how hard he tries, he cannot keep up with the increasing workload. New employees are screaming for their system ids and have no way of checking their id creation status. Existing employees are demanding timely updates to their ids when they transfer across business units. Ex-employees exist in the system months after departure. His frustrated manager will be meeting with him this afternoon to talk about his “unresponsiveness.”

The above is a real-world example. It can happen when a provisioning/user management system is not meeting company needs and there are no plans to upgrade.

This is a two part series on the dark side of provisioning/user management upgrade projects. Upgrade will be defined here to include new hardware and software, and also the supporting environment of business processes, roles, organizations, business rules, etc. This article will discuss reasons why these projects do not get started or fail to reach completion. The next article will cover how to overcome these reasons.

Here are ten reasons why needed system improvements are not implemented:

1. No Budget

IT budgets are frozen or only the most needed projects get funded. This will impact hardware/software maintenance, hiring or contracting needed resources, and more.

2. Infrastructure Is Not Sexy

The budget is there, but fixing an existing provisioning system is not considered a priority. Sadly, many companies see broken systems or processes as the "cost of doing business." Or companies will do only the minimum upgrade needed to keep IT infrastructure running. But beware: as a well-written article once put it, "Cheap is Expensive." It will come back to haunt you.

3. No Technical, Management, Or Financial Champions

It may be a great idea, but there may be no one who can sell it at the mid or upper management level in your company. Also, you may experience "champion burnout" – where past champions who unsuccessfully tried to sell the upgrade no longer wish to try again.

4. Business Case Is Hard to Write

Only by including both "soft" and "hard" savings can one get the true picture of an upgrade's return. "Soft" costs include user login downtime, productivity declines (the cost of finding current information about a person, document, or hardware device), increased calls to the helpdesk, declining helpdesk staff morale, and more. However, "soft" savings are often considered irrelevant by management, and the numbers usually aren't there if you rely on hard savings alone.

5. Can’t Agree on Software/Hardware

For various reasons, technical types cannot always agree which is the best software to meet company needs. Differences may be over preferred operating system, vendor, hardware, software configuration and features, or political/personal whims.

6. Undocumented Current Environment

Perhaps due to turnover or lack of time, no one has documented (or recently updated) what the "AS-IS" user management/provisioning environment looks like. This includes roles and responsibilities, business rules and processes, and software/hardware.

7. No Shared and Communicated Vision

No one has written and communicated a possible "TO-BE" roadmap for provisioning/user management software to decision makers and influencers. This may be due to lack of understanding of the "AS-IS" environment, politics, lack of time, or lack of knowledgeable resources to create such a roadmap. To ensure overall success, the "TO-BE" roadmap ideally should advocate a phased approach.

8. No Project Resources

All available staff who would be working on a software upgrade are busy doing other tasks (like system administration, user support, or other projects). So, there are no available resources that can be dedicated full/part-time to the project. Also, the company may be reluctant to hire outside consultants to perform the upgrade for various reasons.

9. No Agreement on Upgrade Requirements

It is possible to agree on vision, product, and project team and still get nowhere! Reasons could be an honest difference of opinion on configuration settings, hardware setup, features to enable, degree of customization, and more. Unclear and disputed requirements from the start will likely bring disastrous results.

10. Other Concerns

There are other factors, too numerous to mention, that could keep an upgrade project from getting off the ground: security concerns, lack of physical space for hardware, no organization or resources for administration, remote locations building their own unapproved "underground" solutions, organizational changes and mergers (with new organizations having their own IdM vision), vendor changes and mergers, and more.

Conclusion

I hope that this does not discourage you from getting your provisioning/user management upgrade underway. By identifying possible obstacles, you can begin to plan how to overcome each of them. In the next article of the series, we will discuss what you can do to get your provisioning/user management upgrade on management's radar.

Monday, April 21, 2008

Whizlabs MCSD .NET 70-316 Certification Primer

Earning a Microsoft certification acknowledges your expertise in working with Microsoft products and technologies and sets you apart from the crowd as a development professional. Microsoft certification demonstrates that you have the ability to successfully implement Microsoft business solutions for your organization or client.

This article will provide a basic understanding of the scope of the certification and will also give details about the certification along with the useful resources to get started.

Introduction

With .NET, Microsoft is also espousing a vision, this time about how the Internet can make businesses more efficient and deliver services to consumers. Present enterprise setups need n-tier architectures with diverse platforms and object models communicating with each other. Today's applications must be created to run on any platform (Windows, Linux, Mac, Unix, etc.) and may consist of components written in many programming languages and object models.

Many language vendors have tried to upgrade their languages, but there is a limit to how successful they can be, since they have to maintain backward compatibility and face many other problems too. To solve the problems facing today's programmers, Microsoft has come up with a very promising solution: the .NET Platform.

.NET provides a number of benefits that will make developers more productive, reduce the number of bugs, speed application development, and simplify deployment. IT managers are understandably wary, since .NET is a new technology that requires a moderately steep learning curve. For most organizations, however, the benefits will far outweigh the negatives; and with .NET, you'll see great productivity gains for future development projects.

.NET certification is the premier credential for professionals who design and develop leading-edge business solutions with Microsoft .NET development tools, technologies and platforms.

You might want to consider taking the MCSD 70-316 exam to:

* Gain valuable skills, knowledge and expertise
* Raise your income
* Increase your job opportunities
* Earn more respect from your peers
* Improve your job security

What you need to know for MCSD 70-316?

As a competent developer, you should be proficient in creating Windows applications that have a smooth look and feel.

In the Exam 70-316, Microsoft tests your skills on developing Windows based applications with VS.NET/C# on seven objectives.

Exam Objectives and Question Weighting

1. Creating User Services – 15%
2. Creating and Managing Components and .NET Assemblies – 18%
3. Consuming and Manipulating Data – 20%
4. Testing and Debugging – 12%
5. Deploying a Windows-based Application – 18%
6. Maintaining and Supporting a Windows-based Application – 5%
7. Configuring and Securing a Windows-based Application – 12%

For a detailed break up of topics covered in the MCSD .Net 70-316 exam, you can visit the Microsoft certification web page.

The above certification page will provide you with a bird’s eye view of the skills you should focus on. During the course of this article I’ll point you to resources where you can go and read up on the skills listed on this page. The Skills being measured section of the page has the break up of the seven objectives as to which skills will be measured in the exam.

Microsoft’s exam 70-316, “Developing and Implementing Windows-based Applications with Microsoft Visual C# .NET and Microsoft Visual Studio .NET”, is a core requirement for the MCSD (Microsoft Certified Solution Developer) for Microsoft .NET certification and is a core or elective requirement for the MCAD (Microsoft Certified Application Developer) for Microsoft .NET certification. It is designed for candidates who “work on a team in a medium or large development environment that uses Microsoft Visual Studio .NET, Enterprise Developer Edition.”

The MCSD 70-316 exam measures your ability to develop and implement Windows-based applications by using Windows Forms and the Microsoft .NET Framework. Candidates have at least one year of experience developing Windows-based applications. Candidates should have a working knowledge of Microsoft Visual C# .NET.

MCSD 70-316 Exam Specifics

All the questions are multiple choice, and the total number of questions ranges from 55 to 60. The time allotted is 150-175 minutes. The exam is moderately difficult and costs 125 USD. It may be taken at Pearson Vue or Thomson Prometric. Previously, Microsoft provided only a Pass/Fail status for the 70-316 exam, but now 700 is the minimum passing score. Microsoft has also incorporated a new style of question in which you get a split screen: the question at the very top, drag-and-drop items on the bottom left, and configuration screens on the bottom right. You will have plenty of time to answer the exam questions, so there is no need to rush. If you have spare time available, you can double-check the questions and ensure that you have read them correctly and actually answered as intended.

What you need to do to pass MCSD 70-316 exam?

First, you should get an overview of the .NET Framework and identify the tools and services it provides. After getting familiar with the tools and services available, you need to become well versed in Windows Forms. To get started on Windows Forms, you can read the article available on MSDN. It will give you a good introduction to Windows Forms.

.NET is completely object oriented, and it is likely that you will see a question or two on OOP concepts. You need a thorough understanding of OOP concepts like encapsulation, polymorphism, and exception handling. Interoperability is an important area to focus on while preparing for the MCSD 70-316 exam: ActiveX controls from the pre-.NET age are not directly supported by Windows Forms, but they are completely re-usable through wrapper classes. You will also be asked a question or two on the accessibility features of Windows for physically challenged users.

After getting familiar with user services, you should get familiar with how to create and manage components. This involves creating and manipulating .NET assemblies. In this regard, satellite assemblies seem to be one of Microsoft's favorites for the 70-316 exam, so you may get some questions on satellite assemblies, and on resource assemblies as well.

When it comes to manipulating data in .NET, there are two sources of data you will widely deal with: XML and relational databases. You will use ADO.NET to interact with relational databases. A good grasp of ADO.NET is essential for clearing this exam. MSDN provides an article that gives a good overview of ADO.NET.

I will also recommend Professional ADO.NET from Wrox Publications as a good resource. Professional C# also has a very good section on ADO.NET.

Another objective checked in the 70-316 exam concentrates mainly on your familiarity with the Visual Studio .NET debugger. You need a clear understanding of the tools Visual Studio .NET provides for debugging applications. Apart from the debugger, this section places heavy emphasis on tracing. Tracing is a feature by which you can write logs to a particular location (such as a text file), and you can change the level of tracing performed in a live production environment.

One of the advantages of using Visual Studio .NET is the ease of deploying an application. There are many deployment options available in .NET and you must have a clear understanding of using these deployment options. You will get many scenario-based questions in which your understanding of deployment options will be checked.

There will be a few questions that stress more on optimizing code/deployment to increase performance.

Configuring a Windows application is also one of the areas that will be tested and you need to have an understanding about the configuration options available. MSDN provides an article that gives useful information about the configuration options.

Security is one of the key considerations of any application. Microsoft has taken a great initiative to give many options to secure the .NET application. You must ensure that you have a thorough understanding about the security policy and also about the tools available in .NET to secure the application.

Tips and Pitfalls for MCSD 70-316 Exam

Roadmaps to the skills development and career opportunities that the Microsoft .NET platform provides for developers and recommended learning pathways through Official Microsoft Learning Products and Microsoft Press books are available now. In addition to your hands-on experience working with the product, you may want to take advantage of the tools and training on the Get Started with Microsoft Visual Studio .NET Development page to help you prepare.

Dos and Don’ts during the exam and while preparing for it

Dynamic control creation is important to know. Microsoft is focusing more on user-driven content than static content with .NET. Make sure you understand how to create .NET satellite assemblies and localized components. A portion of this has to do with what operating system name and version you have, but there are specific steps you can take to package and deploy your application properly. Focus your data studies on what works optimally with SQL Server.

Brush up on stored procedures if you're a little rusty. Understand the different types of SQL Server authentication as well as the different types of authentication offered by Windows operating systems. Be sure you can pick out code errors when given examples; specifically, know how to resolve looping errors, import errors, and general syntax errors.

Make sure you understand Windows Installer technology and how it relates to applications written in Visual Studio .NET. If you're not a developer who's focused on enterprise-based applications, you may not have a lot of exposure to localization and globalization of software. This is one area where Microsoft believes a certified developer should have a significant background.

Make sure you understand how security works in .NET. Understand which accounts have privileges to run which sections of code. Although it may be a little bit of work, study the .NET policy extensions and how they fit into the operating systems your company is running. We must not forget legacy environments.

Make sure you brush up on ActiveX controls from Visual Studio 6.0. Know how to instantiate the control and work with it in the new environment. It is especially important to understand how legacy components interact with the .NET development environment.

Learn how to use the DataSet object to generate XML and vice versa. You should know how to create schemas and validate data, as well as create strongly typed datasets. Overall, you are expected to have a thorough understanding of XML as used in .NET to succeed in the 70-316 exam.

Conclusion

With Microsoft targeting all its future applications at .NET and companies adopting Microsoft's new initiative, it is quite obvious that .NET is going to pave the way forward and will be Microsoft's weapon for its future vision. .NET is a major leap for Microsoft, with the Internet in mind. Gaining knowledge of .NET, and eventually the MCSD .NET certification, will certainly help you keep pace with future Microsoft releases.

Useful Resources

Books

The following books, written specifically from an MCP exam point of view, are available at the time of writing:

* MCAD/MCSD Self-Paced Training Kit: Developing Windows-Based Applications with Microsoft Visual Basic.NET and Microsoft Visual C#.NET, Second Edition by Matthew A. Stoecker/Microsoft
* MCAD/MCSD Training Guide (70-316): Developing and Implementing Windows-Based Applications with Visual C# and Visual Studio .NET by Amit Kalani
* MCAD/MCSD Exam (70-316) Questions and Answers, with Hands-On Labs: Developing Windows Applications with Visual C# [eBook: Adobe Reader] by Mike Wright

Other than these, following are some good books on the subject:

* Programming C#, Third Edition by Jesse Liberty
* Windows Forms Programming in C# by Chris Sells
* Professional C#, Second Edition by Simon Robinson, K. Scott Allen, Ollie Cornes, Jay Glynn, Zach Greenvoss, Burton Harvey, Christian Nagel, Morgan Skinner, Karli Watson
* Microsoft Visual C# .NET Language Reference by Microsoft Corporation
* Inside C#, Second Edition by Tom Archer, Andrew Whitechapel
* C# and the .NET Platform, Second Edition by Andrew Troelsen

When you are ready to prepare for MCSD 70-316 exam, here's where you should start.

Instructor-led Courses for this exam include:

* Course 2389: Programming with Microsoft ADO .NET
* Course 2555: Developing Microsoft .NET Applications for Windows (Visual C# .NET)

Microsoft Online Resources

* TechNet: Designed for IT professionals, this site includes How-to’s, best practices, downloads, technical chats, and much more.
* MSDN: The Microsoft Developer Network (MSDN) is a reference for developers, featuring code samples, technical articles, newsgroups, chats, and more.
* Training & Certification Newsgroups: A newsgroup exists for every Microsoft certification. By participating in the ongoing dialogue, you take advantage of a unique opportunity to exchange ideas with and ask questions of others, including more than 750 Microsoft Most Valuable Professionals (MVPs) worldwide.

Apart from that, there are other useful sites as follows:

* www.gotdotnet.com
* www.codeproject.com
* www.csharpcorner.com
* www.developer.com
* www.mcpmag.com

Of course, nothing can top the MSDN site for technical content; it is a very comprehensive resource. However, it's a bit too comprehensive, as it caters to a larger set of developers than just those developing applications with C# on Windows.

Exam Simulators

Whizlabs, the market leader in IT Certification Exam Preparation, provides MCSD .NET 70-316 Exam Simulator. The simulator is quite useful and effective in fine-tuning your preparation within a limited timeframe.

Certification Forums

Certification forums are a very good resource for aligning your preparation with your peers. You can discuss any insight, problem, or issue with like-minded professionals and keep yourself updated through the forum. The Whizlabs MCSD .NET Certification Forum provides such a platform to help you succeed in the 70-316 exam and enhance your learning experience.

Saturday, April 19, 2008

All about the new SCMAD Certification Exam

Introduction

The mobile market is envisioned as the next technological wave by leading industry experts. With approximately 150 million mobile phones in use – roughly three times the user base of desktop computers – it might well be the case. Given the fragmented nature of the mobile market, with various manufacturers competing for their share of the pie, Java is once again poised to be the best programming language for the mobile market with its Write Once, Run Anywhere technology.

The specifications around the Java for Wireless Technology initiative have been proposed and backed by most of the leading mobile phone manufacturers (Nokia, Sony Ericsson, Siemens, T-Mobile, to name a few) and hence, one can expect device support and continued innovation.

The Java 2 Platform, Micro Edition (J2ME) offers a highly optimized virtual machine which can be used to run Java applications on devices ranging from resource constrained devices like smart cards, pagers, and mobile phones to high end devices like handheld computers and set-top boxes.

Keeping the limitations of the mobile devices – both in size and in memory – in mind, the Java Community Process has developed a series of standards constructed in a modular fashion to ensure that various features are standardized while keeping the architecture at an abstract level.

While J2ME itself includes many other features and is not limited to mobile phones, the certification concentrates on the developer’s ability to create and install programs for mobile devices, such as cell phones and Personal Digital Assistants (PDAs).

You might want to consider taking this exam

* If you are already into writing J2ME applications for mobile devices and want to be recognized for your skills.
* If you are a developer who has already written a few programs for wireless devices, the preparation will give you in-depth knowledge of various concepts.
* If you are a seasoned J2SE/J2EE developer and want to start writing highly optimized Java applications for mobile devices, this certification is a jump-start for achieving the same.
* If you do not know much about technologies like wireless programming or game programming, preparing for this certification will be a great incentive to get a foothold in them.
* If you want to learn and master the cutting-edge technologies that are just around the corner.

This article will provide a basic understanding of the scope of the certification and will also give details about the certification along with the useful resources to get started.

What you need to know

The Sun Certified Mobile Application Developer (SCMAD) exam tests the developer’s knowledge of the following five specifications.

* Java Technology for the Wireless Industry (JTWI 1.0) JSR-185
* Connected Limited Device Configuration (CLDC 1.0/1.1) JSR-030/JSR-139
* Mobile Information Device Profile (MIDP 2.0) JSR-118
* Wireless Messaging API (WMA 1.1) JSR-120
* Mobile Media API (MMAPI 1.1) JSR-135

Exam Information

Prerequisites

You should have passed the Sun Certified Java Programmer (SCJP) – any version – to appear for this exam.

Objectives

Details of the certification objectives can be found at the Sun website. The major objectives are

1. JTWI (JSR 185) and Overview / JTWI-compliant Wireless Applications
2. CLDC 1.0/1.1
3. Security (both CLDC and MIDP)
4. Networking
5. Application Model/ Delivery/Lifecycle/Provisioning
6. MIDP Persistent Storage
7. Push Registry
8. MIDP UI API
9. MIDP Game API
10. Media using MIDP 2.0 and the Mobile Media API 1.1 (MMAPI)
11. Wireless Messaging API 1.1 (WMA)

Passing Score & Time

The exam consists of 68 questions, and the pass percentage is 55%. The time allotted for the exam is 150 minutes. The fee for the exam is $150 for U.S. candidates and might vary for other countries. Please visit the Sun certification website for more details.

What you need to do

This exam might be slightly tougher than the other exams since the technology is fairly new and the community support is limited.

Since the API set is relatively small, you can cover the topics quickly. On average, it should take you around a week to cover each topic if you are familiar with Java and have written, or attempted to write, a few programs using J2ME. If you are new to wireless programming, you might want to allocate around 2-3 months (at least 2 hours a day) for preparation.

To prepare for the exam, we recommend that you do the following:

1. Download the Java Wireless Toolkit (2.0 or above).
2. If you have a J2ME-enabled cell phone, download the toolkit from that vendor as well. For example, Nokia and Sony Ericsson provide free toolkits and emulators that will help you program for those devices.
3. Download the PDF versions of the specifications mentioned above.

The best way to get an idea about wireless programming is to actually write some programs and deploy them to a cell phone. This will go a long way in your preparation, since the creation and delivery of wireless applications is quite different from that of the standard or enterprise applications. You should develop at least one program for each specification to get a feel of the API. Some of the programs you might want to develop are

1. A program that takes a name and prints out “Hello <name>” (tests the UI API)
2. A simple game or a drawing, like the traditional Paddleball game or various geometric shapes moving in the screen (tests the Game API)
3. A program to read an image off a website and display it on the phone (tests networking)
4. A ‘signed’ Hello World! application (tests security)
5. A program that plays a simple tune (tests MMAPI)
6. A program that displays a text message (tests WMA)
7. A program that calculates tip for various pre-defined scenarios (tests RMS)

To understand the concepts of J2ME programming, you can read the official J2ME tutorial, which is very comprehensive. Some useful books are also listed in the resources section.

Most of the questions will be code-based and hence, it is very important that you understand how the code is structured for various specifications.
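Since the code questions build on the MIDlet lifecycle, it helps to have its skeleton firmly in mind. The sketch below is illustrative only: javax.microedition is not available on a desktop JDK, so it substitutes a hand-rolled FakeMIDlet base class for the real javax.microedition.midlet.MIDlet, and the class and log names are invented for this example.

```java
// Sketch of the MIDlet lifecycle that code-based questions build on.
// NOTE: javax.microedition is not available on a desktop JDK, so this
// example uses a hand-rolled FakeMIDlet stand-in; on a real device the
// class would extend javax.microedition.midlet.MIDlet instead.
import java.util.ArrayList;
import java.util.List;

abstract class FakeMIDlet {                      // stand-in for MIDlet
    protected abstract void startApp();          // Paused -> Active
    protected abstract void pauseApp();          // Active -> Paused
    protected abstract void destroyApp(boolean unconditional); // -> Destroyed
}

class HelloMIDlet extends FakeMIDlet {
    final List<String> log = new ArrayList<>();  // records lifecycle calls

    protected void startApp()  { log.add("start"); }
    protected void pauseApp()  { log.add("pause"); }
    protected void destroyApp(boolean unconditional) { log.add("destroy"); }

    public static void main(String[] args) {
        HelloMIDlet m = new HelloMIDlet();
        // On a phone, the application management software (AMS) drives
        // these transitions; here we invoke them by hand:
        m.startApp();        // user launches the MIDlet
        m.pauseApp();        // e.g., an incoming call pauses it
        m.startApp();        // resumed after the call
        m.destroyApp(true);  // AMS shuts it down
        System.out.println(m.log);   // [start, pause, start, destroy]
    }
}
```

The three lifecycle callbacks and the states they move between come up repeatedly in the exam, so it pays to be able to reproduce this skeleton from memory.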

You might want to consider purchasing the SCMAD exam simulator by Whizlabs, which contains numerous questions of varying difficulty levels spread across five mock exams and a quiz and also lots of useful tips for the exam.

Assuming that you have the Whizlabs SCMAD Exam Simulator (available at http://www.whizlabs.com/articles/scmad-article.html), you can use the table below as a starting point for developing your preparation timeline.

Week | Objective | Notes
0 | Diagnostic Exam | Gives you a feel of what to expect
1 | CLDC 1.0/1.1 | Basics of the VM and its requirements
2 | Application Lifecycle/Provisioning | Basics of MIDP and its requirements
3-4 | MIDP UI API | Develop and deploy Program 1 after this
5 | Networking | Develop and deploy Program 3 after this
6 | MIDP Persistent Storage | Develop and deploy Program 7 after this
7-8 | MIDP Game API | Develop and deploy Program 2 after this
9 | MMAPI | Develop and deploy Program 5 after this
10 | WMA | Develop and deploy Program 6 after this
11 | Push Registry | Small but complicated, and related to WMA
11 | Security | Helps in packaging applications securely
12 | JTWI | Gives an understanding of how the technologies are tied together
12 | Mock Exams | Test your preparation; revise weaker sections

Finally, you can take some mock exams to prepare yourself from a certification standpoint.

Tips and pitfalls

* Understand the conceptual difference between a J2ME configuration (like CLDC) and a J2ME profile (like MIDP).
* Memorize the software and hardware requirements of the various specifications.
* Memorize the class hierarchies of important APIs like Generic Connection Framework, High level and low level UI API, Media classes of MIDP 2.0 and MMAPI 1.1.
* The exam tests the understanding of the features and differences between low level and high-level API for UI programming. So, learn them well.
* Understand the differences between the media support of MIDP 2.0 and MMAPI 1.1.
* Apart from the knowledge of writing proper J2ME code, the exam also tests the ability to write valid Java Application Descriptor (JAD) and manifest files, which are used to describe the deployment details of an application. So, practice writing descriptors, and deploy applications in the toolkit to understand the behavior of the various deployment tags.
* Remember that the exam is a vendor neutral exam. So, you can safely ignore learning the vendor APIs (like the APIs provided by Nokia, Sony Ericsson, and so on).
* The best reading material for this exam is the specification document. So, allot enough time to go through these documents and learn the concepts and API well.
* Make sure what you read is relevant to the exam objectives. A common pitfall is reading and spending time on things that you may not require for the exam.
* As mentioned earlier, try to develop a practical example for each concept as this will help in a better understanding of the concepts.
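One of the tips above recommends practicing JAD files; as a starting point, here is what a minimal descriptor might look like. This is a sketch: the suite name, MIDlet class, vendor, icon path, and JAR details are invented placeholders, though the attribute names themselves come from the MIDP specification.

```
MIDlet-1: Hello, /icons/hello.png, com.example.HelloMIDlet
MIDlet-Name: HelloSuite
MIDlet-Vendor: Example Corp
MIDlet-Version: 1.0.0
MIDlet-Jar-URL: HelloSuite.jar
MIDlet-Jar-Size: 4096
MicroEdition-Configuration: CLDC-1.1
MicroEdition-Profile: MIDP-2.0
```

Deploying a suite whose JAD and manifest disagree on these attributes (or whose MIDlet-Jar-Size is wrong) is a classic exam scenario, so experiment with deliberately broken descriptors in the toolkit.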

Conclusion

With a huge customer base and vast popularity, wireless devices are here to stay. As wireless technology improves and the next generation of devices comes to market with increased bandwidth, the demand for interactive and feature-rich wireless applications will greatly increase. Know-how in wireless programming will give you a competitive edge and prepare you for the future.

Resources

* J2ME tutorial by Sun Microsystems, Inc.
* SCMAD.com provides a comprehensive list of preparatory resources for the certification exam.
* Exam notes by Sathya Srinivasan, to get you started on the certification

Forums

* Whizlabs SCMAD Certification Forum
* SCMAD trail at JavaRanch (You might also want to visit the J2MEtrail)

Books

* Wireless Java: Developing with J2ME by Jonathan Knudsen
* Wireless J2ME Platform Programming by Vartan Piroumian
* J2ME: The Complete Reference by James Keogh
* Enterprise J2ME: Developing Mobile Java Applications by Michael Juntao Yuan

Exam Simulators

There are many ways to prepare for certification exams, one of them being the use of exam simulators. With these you can not only identify your weak areas, but also get a feel of the test environment.

Whizlabs has launched the world’s first SCMAD (J2ME Certification) exam simulator (available at http://www.whizlabs.com/articles/scmad-article.html), which helps ensure your success in the exam with its high-quality mock tests and quick revision tips.

Wednesday, April 16, 2008

Biometrics

ABSTRACT

Biometric identification refers to identifying an individual based on his or her distinguishing physiological and/or behavioural characteristics. As these characteristics are distinctive to each person, biometric identification is more reliable and capable than traditional token-based and knowledge-based technologies at differentiating between an authorized person and a fraudulent one. This paper discusses the mainstream biometric technologies, their advantages and disadvantages, their security issues, and finally their applications in day-to-day life.

INTRODUCTION:

“Biometrics” are automated methods of recognizing an individual based on their physical or behavioral characteristics. Some common commercial examples are fingerprint, face, iris, hand geometry, voice, and dynamic signature. These, as well as many others, are in various stages of development and/or deployment. The type of biometric that is “best” will vary significantly from one application to another. These methods of identification are preferred over traditional methods involving passwords and PINs for various reasons: (i) the person to be identified is required to be physically present at the point of identification; (ii) identification based on biometric techniques obviates the need to remember a password or carry a token. Biometric recognition can also be used in identification mode, where the biometric system identifies a person from the entire enrolled population by searching a database for a match.

A BIOMETRIC SYSTEM:

All biometric systems consist of three basic elements:

* Enrollment, or the process of collecting biometric samples from an individual, known as the enrollee, and the subsequent generation of his template.
* Templates, or the data representing the enrollee’s biometric.
* Matching, or the process of comparing a live biometric sample against one or many templates in the system’s database.

Enrollment

Enrollment is the crucial first stage of biometric authentication because it generates the template that will be used for all subsequent matching. Typically, the device takes three samples of the same biometric and averages them to produce an enrollment template. Enrollment is complicated by the fact that the performance of many biometric systems depends on the user’s familiarity with the biometric device, and enrollment is usually the first time the user is exposed to it. Environmental conditions also affect enrollment: it should take place under conditions similar to those expected during the routine matching process. For example, if voice verification is used in an environment where there is background noise, the system’s ability to match voices to enrolled templates depends on capturing these templates in the same environment. In addition to user and environmental issues, biometrics themselves change over time. Many biometric systems account for these changes by continuous averaging: templates are averaged and updated each time the user attempts authentication.

Templates

Templates are the data representing the enrollee’s biometric, and the biometric device creates them using a proprietary algorithm to extract “features” appropriate to that biometric from the enrollee’s samples. Templates are only a record of distinguishing features, sometimes called minutiae points, of a person’s biometric characteristic or trait; they are not an image or record of the actual fingerprint or voice. In basic terms, templates are numerical representations of key points taken from a person’s body. A template is usually small in terms of computer memory, which allows for quick processing, a hallmark of biometric authentication. The template must be stored somewhere so that subsequent templates, created when a user tries to access the system using a sensor, can be compared against it. Some biometric experts claim it is impossible to reverse-engineer, or recreate, a person’s print or image from the biometric template.

Matching

Matching is the comparison of two templates, the template produced at the time of enrollment (or at previous sessions, if there is continuous updating) with the one produced “on the spot” as a user tries to gain access by providing a biometric via a sensor. There are three ways a match can fail:

* Failure to enroll.
* False match.
* False nonmatch.

Failure to enroll (or acquire) is the failure of the technology to extract distinguishing features appropriate to that technology. For example, a small percentage of the population fails to enroll in fingerprint-based biometric authentication systems, for two main reasons: the individual’s fingerprints are not distinctive enough to be picked up by the system, or the distinguishing characteristics of the individual’s fingerprints have been altered because of the individual’s age or occupation, e.g., an elderly bricklayer. In addition, the possibility of a false match (FM) or a false nonmatch (FNM) exists. These two terms are frequently misnamed “false acceptance” and “false rejection,” respectively, but those terms are application-dependent in meaning; FM and FNM are application-neutral terms for describing the matching process between a live sample and a biometric template. A false match occurs when a sample is incorrectly matched to a template in the database (i.e., an impostor is accepted). A false nonmatch occurs when a sample is incorrectly not matched to a truly matching template in the database (i.e., a legitimate match is denied). Rates for FM and FNM are calculated and used to make tradeoffs between security and convenience. For example, a heavy security emphasis errs on the side of denying legitimate matches and does not tolerate acceptance of impostors, while a heavy emphasis on user convenience tolerates some acceptance of impostors in exchange for rarely denying legitimate matches.
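To make the FM/FNM tradeoff concrete, here is a toy sketch in plain Java. All similarity scores and thresholds below are invented for illustration; real systems derive these rates from large test populations.

```java
// Toy illustration of the false-match / false-nonmatch tradeoff.
// The scores and thresholds are invented for this example.
class MatchTradeoff {
    // similarity scores (0..1) for genuine attempts and impostor attempts
    static final double[] GENUINE  = {0.91, 0.84, 0.78, 0.66, 0.95};
    static final double[] IMPOSTOR = {0.12, 0.33, 0.71, 0.25, 0.48};

    // false nonmatch rate: fraction of genuine samples rejected
    static double fnmRate(double threshold) {
        int denied = 0;
        for (double s : GENUINE) if (s < threshold) denied++;
        return (double) denied / GENUINE.length;
    }

    // false match rate: fraction of impostor samples accepted
    static double fmRate(double threshold) {
        int accepted = 0;
        for (double s : IMPOSTOR) if (s >= threshold) accepted++;
        return (double) accepted / IMPOSTOR.length;
    }

    public static void main(String[] args) {
        // A high (security-first) threshold rejects more genuine users...
        System.out.println("t=0.80  FM=" + fmRate(0.80) + "  FNM=" + fnmRate(0.80));
        // ...while a low (convenience-first) threshold accepts more impostors.
        System.out.println("t=0.40  FM=" + fmRate(0.40) + "  FNM=" + fnmRate(0.40));
    }
}
```

Moving the threshold trades one error rate for the other, which is exactly the security-versus-convenience decision described above.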

BIOMETRIC TECHNOLOGIES:

The function of a biometric authentication system is to facilitate controlled access to applications, networks, personal computers (PCs), and physical facilities. A biometric authentication system is essentially a method of establishing a person’s identity by comparing the binary code of a uniquely specific biological or physical characteristic to the binary code of an electronically stored characteristic called a biometric. The defining factor for implementing a biometric authentication system is that it cannot fall prey to hackers; it can’t be shared, lost, or guessed. Simply put, a biometric authentication system is an efficient way to replace the traditional password-based authentication system. While there are many possible biometrics, at least eight mainstream biometric authentication technologies have been deployed or pilot-tested in applications in the public and private sectors. They can be grouped into two categories:

* Contact Biometric Technologies
o fingerprint,
o hand/finger geometry,
o dynamic signature verification, and
o keystroke dynamics
* Contactless Biometric Technologies
o facial recognition,
o voice recognition,
o iris scan, and
o retinal scan

CONTACT BIOMETRIC TECHNOLOGIES:

For the purpose of this study, a biometric technology that requires an individual to make direct contact with an electronic device (scanner) will be referred to as a contact biometric. By its very nature, a contact biometric requires a person desiring access to make direct contact with an electronic device in order to attain logical or physical access. Because of this inherent need for direct contact, many people have come to consider contact biometrics to be technologies that encroach on personal space and are intrusive to personal privacy.

Fingerprint

The fingerprint biometric is an automated digital version of the old ink-and-paper method used for more than a century for identification, primarily by law enforcement agencies. The biometric device involves users placing their finger on a platen for the print to be read. The minutiae are then extracted by the vendor’s algorithm, which also performs a fingerprint pattern analysis. Fingerprint template sizes are typically 50 to 1,000 bytes. Fingerprint biometrics currently have three main application arenas: large-scale Automated Finger Imaging Systems (AFIS), generally used for law enforcement purposes; fraud prevention in entitlement programs; and physical and computer access.

Hand/Finger Geometry

Hand or finger geometry is an automated measurement of many dimensions of the hand and fingers. Neither of these methods takes actual prints of the palm or fingers. Only the spatial geometry is examined as the user puts his hand on the sensor’s surface and uses guiding poles between the fingers to properly place the hand and initiate the reading. Hand geometry templates are typically 9 bytes, and finger geometry templates are 20 to 25 bytes. Finger geometry usually measures two or three fingers. Hand geometry is a well-developed technology that has been thoroughly field-tested and is easily accepted by users.

Dynamic Signature Verification

Dynamic signature verification is an automated method of examining an individual’s signature. This technology examines such dynamics as speed, direction, and pressure of writing; the time that the stylus is in and out of contact with the “paper”; the total time taken to make the signature; and where the stylus is raised from and lowered onto the “paper.” Dynamic signature verification templates are typically 50 to 300 bytes.

Keystroke Dynamics

Keystroke dynamics is an automated method of examining an individual’s keystrokes on a keyboard. This technology examines such dynamics as speed and pressure, the total time of typing a particular password, and the time a user takes between hitting certain keys. This technology’s algorithms are still being developed to improve robustness and distinctiveness. One potentially useful application that may emerge is computer access, where this biometric could be used to verify the computer user’s identity continuously.

CONTACTLESS BIOMETRIC TECHNOLOGIES:

A contactless biometric can come in either a passive form (the biometric device continuously monitors for the correct activation frequency) or an active form (the user initiates activation at will). In either event, authentication of the user’s biometric should not take place until the user voluntarily agrees to present the biometric for sampling. A contactless biometric can be used to verify a person’s identity and offers at least two dimensions that contact biometric technologies cannot match. A contactless biometric does not require undesirable contact in order to extract the required sample of the biological characteristic, and in that respect it is most adaptable to people of variable ability levels.

Facial Recognition

Facial recognition records the spatial geometry of distinguishing features of the face. Different vendors use different methods of facial recognition; however, all focus on measures of key features. Facial recognition templates are typically 83 to 1,000 bytes. Facial recognition technologies can encounter performance problems stemming from such factors as noncooperative behavior of the user, lighting, and other environmental variables. Facial recognition has been used in projects to identify card counters in casinos, shoplifters in stores, criminals in targeted urban areas, and terrorists overseas.

Voice Recognition

Voice or speaker recognition uses vocal characteristics to identify individuals using a pass-phrase. Voice recognition can be affected by environmental factors such as background noise. Additionally, it is unclear whether the technologies actually recognize the voice or just the pronunciation of the pass-phrase (password) used. This technology has been the focus of considerable efforts on the part of the telecommunications industry and NSA, which continue to work on improving reliability. A telephone or microphone can serve as a sensor, which makes it a relatively cheap and easily deployable technology.

Iris Scan

Iris scanning measures the iris pattern in the colored part of the eye, although the iris color has nothing to do with the biometric. Iris patterns are formed randomly. As a result, the iris patterns in your left and right eyes are different, and so are the iris patterns of identical twins. Iris scan templates are typically around 256 bytes. Iris scanning can be used quickly for both identification and verification applications because of its large number of degrees of freedom. Current pilot programs and applications include ATMs (“Eye-TMs”), grocery stores (for checking out), and a few international airports (physical access).

Retinal Scan

Retinal scans measure the blood vessel patterns in the back of the eye. Retinal scan templates are typically 40 to 96 bytes. Because users perceive the technology to be somewhat intrusive, retinal scanning has not gained popularity with end-users. The device involves a light source shined into the eye of a user who must be standing very still within inches of the device. Because the retina can change with certain medical conditions, such as pregnancy, high blood pressure, and AIDS, this biometric might have the potential to reveal more information than just an individual’s identity.

Emerging biometric technologies:

Many inventors, companies, and universities continue to search the frontier for the next biometric that shows potential of becoming the best. An emerging biometric is one that is still in the early stages of technological maturation; once proven, an emerging biometric will evolve into an established biometric. Such emerging technologies include the following:

* Brainwave Biometric
* DNA Identification
* Vascular Pattern Recognition
* Body Odor Recognition
* Fingernail Bed Recognition
* Gait Recognition
* Handgrip Recognition
* Ear Pattern Recognition
* Body Salinity Identification
* Infrared Fingertip Imaging & Pattern Recognition

SECURITY ISSUES:

The most common standardized encryption method used to secure a company’s infrastructure is the Public Key Infrastructure (PKI) approach. This approach consists of two keys with a binary string ranging in size from 1024 bits to 2048 bits; the first key is a public key (widely known) and the second key is a private key (known only by the owner). However, the private key must also be stored, and inherently it too can fall prey to the same authentication limitations of a password, PIN, or token: it can be guessed, lost, stolen, shared, hacked, or circumvented. This is even further justification for a biometric authentication system. Because of the structure of the technology industry, making biometric security a feature of embedded systems, such as cellular phones, may be simpler than adding similar features to PCs. Unlike the personal computer, the cell phone is a fixed-purpose device; to successfully incorporate biometrics, cell-phone developers need not gather support from nearly as many groups as PC-application developers must. Security has always been a major concern for company executives and information technology professionals of all entities. A biometric authentication system that is correctly implemented can provide unparalleled security, enhanced convenience, heightened accountability, and superior fraud detection, and it is extremely effective in discouraging fraud. Controlling access to the logical and physical assets of a company is not the only concern that must be addressed; companies, executives, and security managers must also take into account the security of the biometric data (template). There are many urban biometric legends about cutting off someone’s finger or removing a body part for the purpose of gaining access. These are not true, for once the blood supply of a body part is cut off, the unique details of that body part start to deteriorate within minutes. Hence the unique details of the severed body part(s) are no longer in any condition to function as acceptable input for scanners.
The best overall way to secure an enterprise infrastructure, whether small or large, is to use a smart card. A smart card is a portable device with an embedded central processing unit (CPU). It can be fashioned to resemble a credit card, an identification card, a radio frequency identification (RFID) tag, or a Personal Computer Memory Card International Association (PCMCIA) card. The smart card can be used to store data of all types, but it is commonly used to store encrypted data, human resources data, medical data, financial data, and biometric data (templates). It can be accessed via a card reader, PCMCIA slot, or proximity reader. In most biometric-security applications, the system does not determine the identity of the person who presents himself on its own. Instead, the identity is supplied to the system, often by presenting a machine-readable ID card, and the system is asked to confirm it. This problem is “one-to-one matching,” and today’s PCs can conduct a one-to-one match in, at most, a few seconds. One-to-one matching differs significantly from one-to-many matching: in a system that stores a million sets of prints, a one-to-many match requires comparing the presented fingerprint with 10 million prints (1 million sets times 10 prints per set). A smart card is a must when implementing a biometric authentication system; only by using a smart card can an organization satisfy all security and legal requirements. Smart cards possess the basic elements of a computer (interface, processor, and storage) and are therefore very capable of performing authentication functions right on the card. Performing authentication within the confines of the card is known as ‘Matching on the Card (MOC)’.
From a security perspective, MOC is ideal, as the biometric template, biometric sample, and associated algorithms never leave the card and as such cannot be intercepted or spoofed by others (Smart Card Alliance). The problem with smart cards is that the public-key infrastructure certificates built into the card do not solve the problem of someone stealing the card or creating a counterfeit one. A trusted third party (TTP) can be used to verify the authenticity of a card via an encrypted MAC (Message Authentication Code).
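As a quick sanity check on the scale difference between the two matching modes, the comparison counts can be sketched in plain Java, using the same figures as the fingerprint example above:

```java
// Back-of-the-envelope comparison counts for one-to-one vs. one-to-many
// matching, mirroring the text's example (1 million enrollees, 10 prints each).
class MatchingModes {
    // one-to-one: verify a single claimed identity against one stored template
    static long oneToOne() { return 1; }

    // one-to-many: search the presented print against every stored print
    static long oneToMany(long enrollees, long printsPerPerson) {
        return enrollees * printsPerPerson;
    }

    public static void main(String[] args) {
        System.out.println("one-to-one comparisons : " + oneToOne());
        System.out.println("one-to-many comparisons: " + oneToMany(1_000_000L, 10L));
    }
}
```

The seven-orders-of-magnitude gap is why identification (one-to-many) systems need far more compute than verification (one-to-one) systems such as MOC smart cards.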

CULTURAL BARRIERS/PERCEPTIONS:

People as diverse as those of variable abilities are subject to many barriers, theories, concepts, and practices that stem from the relative culture (e.g., stigma, dignity, or heritage) and perceptions (e.g., religious or philosophical) of the international community. These factors are so great that they could encompass a study of their own. To that end, it is also theorized that, to a certain degree, the application of diversity factors from current theories, concepts, and practices may be capable of providing a sturdy framework for the management of employees with disabilities. Moreover, it has been implied that the term diversity is a synonymous reflection of the initiatives and objectives of affirmative-action policies. The concept of diversity in the workplace actually refers to the differences embodied by the workforce at large: differences between employees of different or diverse ethnic origin, racial descent, gender, sexual orientation, chronological maturity, and ability; in effect, minorities.

ADVANTAGES OF BIOMETRIC TECHNOLOGIES:

Biometric technologies can be applied to areas requiring logical access solutions, and can be used to access applications, personal computers, networks, financial accounts, human resource records, and the telephone system, and to invoke customized profiles to enhance the mobility of the disabled. In a business-to-business scenario, the biometric authentication system can be linked to the business processes of a company to increase accountability of financial systems, vendors, and supplier transactions; the results can be extremely beneficial. The global reach of the Internet has made the services and products of a company available 24/7, provided the consumer has a user name and password to log in. In many cases the consumer may have forgotten his or her user name, password, or both, and must then take steps to retrieve or reset the lost or forgotten login information. By implementing a biometric authentication system, consumers can opt to register their biometric trait or smart card with a company’s business-to-consumer e-commerce environment, which will allow them to access their accounts and pay for goods and services. The benefit is that consumers will never lose or forget their user name or password and will be able to conduct business at their convenience. A biometric authentication system can also be applied to areas requiring physical access solutions, such as entry into a building, a room, or a safe, or it may be used to start a motorized vehicle. Additionally, a biometric authentication system can easily be linked to a computer-based application used to monitor the time and attendance of employees as they enter and leave company facilities. In short, contactless biometrics can and do lend themselves to people of all ability levels.

DISADVANTAGES OF BIOMETRIC TECHNOLOGIES:

Some people, especially those with disabilities, may have problems with contact biometrics. This is not because they do not want to use them, but because they have a disability that either prevents them from maneuvering into a position that allows them to make use of the biometric, or because the biometric authentication system (solution) is not adaptable to the user. For example, if the user is blind, a voice biometric may be more appropriate.

BIOMETRIC APPLICATIONS:

Most biometric applications fall into one of nine general categories:

* Financial services (e.g., ATMs and kiosks).
* Immigration and border control (e.g., points of entry, precleared frequent travelers, passport and visa issuance, asylum cases).
* Social services (e.g., fraud prevention in entitlement programs).
* Health care (e.g., security measure for privacy of medical records).
* Physical access control (e.g., institutional, government, and residential).
* Time and attendance (e.g., replacement of time punch card).
* Computer security (e.g., personal computer access, network access, Internet use, e-commerce, e-mail, encryption).
* Telecommunications (e.g., mobile phones, call center technology, phone cards, televised shopping).
* Law enforcement (e.g., criminal investigation, national ID, driver’s license, correctional institutions/prisons, home confinement, smart gun).

CONCLUSION:

Currently, there exists a gap between the number of feasible biometric projects and the number of knowledgeable experts in the field of biometric technologies. The September 11, 2001 attacks (a.k.a. 9/11) on the World Trade Center gave rise to this knowledge gap: post-9/11, many nations have recognized the need for increased security and identification protocols on both the domestic and international fronts. This is, however, changing as studies and curricula associated with biometric technologies are starting to be offered at more colleges and universities. Another way of closing the biometric knowledge gap is for knowledge seekers to participate in biometric discussion groups and biometric standards committees. A good solution requires only minimal user knowledge and effort, and would be welcomed by both the purchaser and the end user. But keep in mind that, at the end of the day, all that end users care about is that their computer functions correctly and that the interface is friendly, for users of all ability levels. Alternative methods of authenticating a person’s identity are not only good practice for making biometric systems accessible to people of variable ability levels; they also serve as a viable way of dealing with authentication and enrollment errors. Auditing processes and procedures on a regular basis, during and after installation, is an excellent method of ensuring that the solution is functioning within normal parameters. A well-orchestrated biometric authentication solution should not only prevent and detect an impostor instantaneously, but should also keep a secure log of transaction activities for the prosecution of impostors.
This is especially important because a great deal of ID theft and fraud involves employees, and a secure log of transaction activities provides the means for prosecution or quick resolution of disputes.
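The secure transaction log described above can be sketched as a hash-chained audit log, where each entry commits to the one before it so that tampering with history is detectable. This is a minimal illustration only; the function names and log format are my own invention, not drawn from any particular biometric product:

```python
import hashlib
import json
import time

def append_entry(log, user, event):
    """Append a verification event to a tamper-evident log. Each entry
    stores the hash of the previous entry, so altering or deleting an
    old record breaks the chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"user": user, "event": event, "time": time.time(), "prev": prev_hash}
    # Hash the entry's own fields (it does not yet contain "hash").
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log):
    """Return True only if no entry has been altered or removed."""
    prev = "0" * 64
    for e in log:
        if e["prev"] != prev:
            return False
        body = {k: v for k, v in e.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, "alice", "fingerprint match")
append_entry(log, "mallory", "fingerprint mismatch")
print(verify_chain(log))                   # intact chain verifies
log[0]["event"] = "fingerprint match"      # an insider edits history...
log[0]["user"] = "someone else"
print(verify_chain(log))                   # ...and verification now fails
```

A real deployment would also write the log to append-only storage and sign it, but even this small sketch shows why such a log supports prosecution: an employee cannot quietly rewrite past authentication events.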

Sunday, April 13, 2008

Ten Great Careers For Computer “Geeks.”

The universal acceptance of computers into our daily lives, both at work and at home, has decreased the image of computer users as being “geeks.” The word geek itself has evolved a bit - going from meaning a socially inept person who gets along better with computers than people, to someone who is an expert with computers, a guru even. In fact, many computer service companies utilize the name geek in their nomenclature because of this new meaning.

Not everyone who is proficient with a computer is a geek, but there are people out there who are so interested in computers, and so well versed in them, that they wear the title geek with pride. Many of these people may not have had formal training; they’ve been playing with computer hardware or software since they were ten years old. So what should you do if you have this kind of computer knowledge? A few years ago it was very easy to get a well-paying computer job without any post-secondary education. Advances in technology, the dot-com implosion and the wider acceptance of technology have made it harder. The good news is, you don’t need a four-year degree to secure a well-paying job in the computer field. Even if you’re not a self-professed computer geek, if you have an interest in a computer career, here are some good fields to study.

Computer Networking

Computer networking jobs entail designing, repairing and maintaining PC networks, usually in a business setting. There is no industry standard for software, but Microsoft dominates, with Novell a distant second. Cisco dominates the hardware router category. Courses of study available include A+ (basic computer hardware), MCSE (Microsoft Certified Systems Engineer), MCSA (Microsoft Certified Systems Administrator), Novell NetWare and Cisco certification.

Career positions in this category include network design, network administration and network security. Depending on the employer, a computer networking professional may perform all or some of these duties.

Computer Security

Computer security is another growing field. Many businesses have created networks, websites and become reliant on computer technology, without employing safeguards to protect their data. There are many malevolent computer geeks out there who attack systems, or software for fun, curiosity or profit. Data extortion is now a common organized crime method for the Russian mafia!

Security violations have created new careers in network security and software development. Courses of study are mainly in Microsoft products and software development languages like Visual Basic, C++ and .NET, as well as compiler and assembly languages.

Career positions in this category include network security, software programming, web design, web development and website administration (server side).

Databases

The acceptance of computers into business has created a great demand for databases. Almost every industry has a need for databases for marketing, client retention and daily operations. Industries such as banking, insurance, hospitals and utilities absolutely rely on them. Terrorism threats have created new laws, like the Patriot Act, that require a database of all foreign nationals who enter the country.

Creation of these databases relies on software, mainly developed by Oracle for large scale databases, Microsoft SQL for web based applications and Microsoft Access for smaller scale and custom applications.

Jobs in the database category include data architects, database administrators and information systems managers.

The information age has created a wealth of career opportunities for computer geeks, elevating their status as knowledgeable professionals and compensating them well financially. If you’re a computer geek, or would like to become one, a career in any of these professions can be obtained in less than two years of study.

Friday, April 11, 2008

Experience the Strange and Twisted World of Internet Cafes

Being a novice of the Internet, and of technology in general, the idea of hanging out in an Internet café scared me about as much as I was scared when I first sat in front of a computer trying to figure out how to turn the thing on. Pictures of super-obese, nerdy, snotty-nosed tech-heads filled my head, as well as visions of stagnant, sterile rooms crammed with blue screens flashing the latest comic heroes and teen idols. Well, I just went on a trip to several countries around the world and I ventured into this unknown sector, as I wanted to keep in touch with people back home. Although I admittedly didn’t visit many of these cafes (I needed a break from the computer worlds of school and work), the few I did see were quite different from what I expected.

First stop was a café in Prague, Czech Republic. It was hidden down an alley but was next to a restaurant, so it seemed safe enough. I paid the equivalent of just over $1.00 U.S. for a half hour on the machine; not too bad, eh? Now I don’t know if I was attacked by an evil spirit or just had something wrong with my brain that day, but for some reason it took 25 minutes just to get into my hotmail! First of all, the keyboard was different, and in weird ways, I tell you. If you hit the Y key it came out as a Z, and vice versa. I was surprised later in London that their keyboard was also different from the ones back home; I thought keyboards were universally the same around the planet! I finally got into my hotmail, sent a one-sentence email and left. Looking around me as I departed, I saw that most of the people using the computers were so-called normal folks like myself, mainly backpackers and tourists, but also businessmen and the like. If I had had more patience that day it probably would have been a nice atmosphere to be part of; I saw that they served coffee and tea. But as I was in a beautiful foreign place, I needed to get out under the sun amidst the real action.

My other main experience in the public Internet world was in the north of London, in a suburb called Neesdon, or ‘Sneezdon’ as the Aussie mates I met up with liked to call it. Here, the café was totally different from the previous one, and I assume (I don’t like to assume) that, like people, each café is an individual entity with both positive and negative traits, as well as bonuses and letdowns. This ‘café’ was in the back of a mobile (cellular) phone shop and was just a tiny room with a handful of computers that no one else seemed interested in. Everyone looked more focused on phones, but I have a weird feeling they might have been dealing something else ‘under the counter’ that was sparking hot interest.

We paid 1 pound (approx. $1.75 U.S.) for an hour. Three of us walked in and sat down at two computers; I think this was probably disallowed, but again no one was paying any attention to us. My friend said he had been to this particular café a few times before, as he had been living in the area for a year, and had seen and experienced some undesirable situations. One day he believes he saw a cannabis deal go down, and on another occasion a drunken man came in, told him he didn’t like the look of him, and asked if he would go outside for a fight. The confrontation dissipated with the use of calm communication, but you can imagine that this type of Internet café would be rated poorly on a world standard. So beware where you enter the virtual world, my friends; reality may just arrive to smash you in the face!

In the end, I think that, like all things in life, these Internet cafes need to be approached with optimism and hope, as one never truly knows what a new experience has to offer. I imagine that there is a multitude of different types of cafes, some you may even deem cool places to gravitate to. Whether strange and twisted is what you look for, or common normality, I believe both can be found in the cafes of the future.

Monday, April 7, 2008

History of World / Regional Search Engines and Directories

by: Julie Wartes & Joel Ennis


Computers have become a way of life for people around the world. They are used to research term papers, check weather forecasts, track military progress, exchange ideas (blogs and chat), find the cheapest price on items, and so on. It is no surprise that as the computer age takes hold, computer usage has increased. The number of websites on the World Wide Web is growing at an exponential rate. And because we live in a quick-fix society with limited time on our hands, we need something to make surfing the web easier, something that will sort this influx of information into a logical order. Hence the wonder of search engines, which have transformed the meaning of search and made our jobs easier. This paper focuses on search engine case studies around the world. The search engine regions selected for research in this paper are: American, European, Canadian, Australian and the UK.

American Search Engines

Introduction

As of January 1, 2004 the estimated human population of the United States of America was 292,287,428, and there were 164,100,000 computers (1). An estimated 90,000,000 households have access to the internet (2). With this many people surfing the web, curiosity might cause one to wonder about the search engines being used in America. Two of the most popular and well-known search engines in America are Google and Yahoo. Although most people have a preference for one or the other, it is difficult to say one is truly better, as the quality of search is similar. According to some users, Google is the most powerful search engine in the world (3). However, other users argue that between the two, “There is an imperceptible difference in the quality and relevancy of results.” (3).

History of Google

Google, www.google.com, was created by two Stanford University students. The original name was BackRub, but by 1998 the name had been changed to Google and it became a private company (4). A googol is 10^100; the word Google was created as a variation of it (5). Some argue that Google is the top search engine in the world, and indeed it has become so popular that it is now used as a verb (5).

General Information

Google has grown from indexing 25,000,000 pages in November of 1998 to 4,285,199,774 pages as of June 2004 (6). Google is a crawler-based search engine, meaning that it sends out crawlers which automatically visit sites, follow links and create indexes which are then searchable (7). Google is currently in a legal battle with the court systems to keep its name out of the dictionary (5). Google was the first major search engine to index non-HTML web content, allowing PDF files to be accessed (8). Google also offers services such as “cached” links that allow a user to view older versions of pages. Other features include spell check, dictionary definitions, and the integration of stock quotes, street maps, and telephone numbers (4).
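The crawl-follow-index cycle described above can be sketched in a few lines of Python. Everything here is invented for illustration (the toy two-page “site”, the class and function names); a real crawler fetches pages over HTTP, respects robots.txt, and handles far more edge cases:

```python
from collections import defaultdict
from html.parser import HTMLParser

# A toy "web": page URLs mapped to their HTML, standing in for live pages.
SITE = {
    "http://example.com/": '<a href="http://example.com/about">about</a> welcome page',
    "http://example.com/about": "all about this example site",
}

class PageParser(HTMLParser):
    """Collects the href targets of anchor tags, plus the visible text."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

    def handle_data(self, data):
        self.words += data.lower().split()

def crawl(start):
    """Visit pages from `start`, follow discovered links, and build an
    inverted index mapping each word to the set of pages containing it."""
    index, queue, seen = defaultdict(set), [start], set()
    while queue:
        url = queue.pop()
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = PageParser()
        parser.feed(SITE[url])
        queue += parser.links              # follow links to new pages
        for word in parser.words:
            index[word].add(url)           # index every word on the page
    return index

index = crawl("http://example.com/")
print(sorted(index["about"]))              # both pages mention "about"
```

A search for a word is then just a set lookup in `index`, which is why crawler-based engines can answer queries over billions of pages without revisiting them.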

Google Homepage

The home page offers the user many convenient features. Google has an easy-to-use toolbar that users can download onto their personal computers (3). If a user wants to search specifically for pictures, the “Images” button gives them access to the entire web, but only results containing images will show up. The “Groups” button allows users to participate in discussions taking place on Usenet newsgroups. The “News” button provides headline news stories, and you can have news alerts on specific subjects sent directly to your email. “Froogle” is a simple, easy way for a user to search for a bargain or even a specific item online. The “more” link provides access to human-compiled information from the “Directory”, and allows a user to get travel information or even simply use a calculator. The “Advanced Search” allows a user to limit a search to PDFs or .edu results. “Preferences” allows customization of Google services. “Language Tools” will translate any website or article into an understandable language. If you’re curious about Advertising, Business Solutions or About Google, click one of the links below the search box. The “I’m Feeling Lucky” button takes a user directly to the first page result of their search without showing any other pages. Google lists its sponsored links separately from its general search results.

History of Yahoo

Yahoo, www.yahoo.com, was created in 1994 (4). Most people are familiar with the phrase, “Do You Yahoo?” created as a Yahoo advertising slogan.

General Information

In 2004 Yahoo is believed to be indexing about 3 billion pages (9). Yahoo is the oldest directory, “relying on human editors to organize web sites into categories”; however, Yahoo made a giant shift towards crawler-based listings in October of 2002 (4). Yahoo previously relied on Google for search results, but in February of 2004 Yahoo created its own search technology (4). “It is important to note the new search engine is for web results only. Image search is still powered by Google, and News search is a combination of Yahoo’s own editorial and technological resources.” (10). Yahoo has many of the same features Google provides, such as “cached” links, spell check, a dictionary, yellow pages, and mapping programs. Yahoo also provides free email services, as long as the user doesn’t mind sharing the page with advertisements. In fact, Yahoo’s knowledge and experience from its email services are believed to give Yahoo a deeper understanding of spam and an ability to keep it out of its web page index (10).

Yahoo Homepage

Yahoo’s home page is cluttered compared to the simple format of Google. Yahoo has the typical advanced search options and preferences, allowing for customization of features. To the right of the search box is a pull tab that allows a user to limit their search to the web, images, yellow pages, news, or product information. A user can look up movie times, send Yahoo greetings, search for a house, or find travel information, to name just a few things. Yahoo’s website directory has sites organized by subject, such as education, science, health, and government. Towards the bottom of the page is a list of local Yahoos. Who knew that there was a Yahoo Singapore, let alone a Yahoo Catalan? Like Google, Yahoo lists its sponsored links separately from its general search results.

European Search Engines

Introduction

There are 44 nations in the European region. These countries are home to 567,095,995 people, 9.33% of the world’s population (11). Each of these countries has at least one regional public search engine, and some have several (12). Booz-Allen & Hamilton, one of the United States’ older and more respected consulting firms, conducted a study which pointed out that “…22% of European households have access to the Internet” and that in 2004 “Western Europe should have more than 215,000,000 users on the Internet on at least a quarterly basis” (13).

The countries of Europe have been involved in a long and complex process of European integration. The EU was formed on 1 November 1993 and is a union of 25 independent states (Appendix 1) (14). As one of its objectives, the EU has taken on the challenge of integrating member states and surrounding countries to provide them with stable Internet and e-commerce development and growth (15). Alltheweb and Ezilon are two of the many search engine companies seeking to capitalize on the opportunity provided by EU integration.

History of Alltheweb

Fast Search and Transfer (Fast) ASA started business operations in 1997 in Oslo, Norway, and went public in June 2001. Fast is an outgrowth of academic research and development at the Norwegian University of Science and Technology (16). Fast Search and Transfer launched Alltheweb in May 1999 under the “Fast” brand name until 2001, when the site was redesigned under the “Alltheweb” brand. Alltheweb, www.alltheweb.com, then became a public search engine and technology showcase for the Fast Search Company (17). In February of 2003, Overture announced it would purchase Alltheweb from Fast. The acquisition, coupled with Overture’s previously announced purchase of the AltaVista Company, helped Overture bolster its leadership in commercial search and positioned the company to create the most comprehensive search capability on the Internet (18). Yahoo! Corporation bought Overture on October 7, 2003 to fortify itself for an eventual showdown with Google and Microsoft (19).

General Information

Through the Overture and Yahoo acquisitions, Alltheweb now has Yahoo’s indexing power behind it and has become a leader in the search engine community throughout Europe (19). Its employees strive to help Yahoo provide the best public search engine not just in Europe, but in the world.

Alltheweb Homepage

Pulling up the Alltheweb homepage (20), the user is provided with a simple, easy-to-navigate site. The homepage contains five tabs: web, news, pictures, video and audio. However, the user has to search for a specific topic to find information in these categories, rather than the information being presented on the site. All “sponsored links” appear to the right of the page, where they are available if desired. Only after glancing through all of the paid links does the viewer come to the most relevant search returns. On completing a search, the user is offered an option to “Refine your search”, which provides further subcategories on the topic. Other features offered on Alltheweb include the ability to filter potentially offensive content, restrict the number of results per page, highlight search terms, change the text size, and open search results in a new window. Users can select up to 8 preferred languages from a list of 36.

History of Ezilon

The Ezilon.com search engine was founded to allow individuals and companies around the world, particularly the EU community, to access information easily (21). The Ezilon homepage states that Ezilon.com is a European Union international search engine and directory (22).

General Information

The Ezilon search engine provides a searchable directory of European websites, ranging from Europe travel links, business, hotels, news and classified ads, and especially links from the Western European countries. They also accept listings for non-EU websites, but these must have content of interest to the EU or Europe generally (23).

Ezilon Homepage

Ezilon’s homepage has immediately accessible information on it. Unlike other websites, which present “Sponsored links”, Ezilon provides “Sponsored matches” prior to the search results; these can be viewed to the right of the page.

Ezilon specializes, through the guidance of the EU, in pooling and distributing information for the advancement of a united Europe. Other convenient features available to users are popular links, a weather forecast and headline news. Various categories are available for search, e.g., art, science, business, technology, etc. Users have the option to search the EU, the UK or the World. In addition to all the above-mentioned features, free email is provided to members of Ezilon.

Canadian Search Engines

Introduction

As of July 1, 2004, the estimated population of Canada is 32,247,874, with 16,19,000 internet users (24). Canada today stands at number 7 among countries with internet users (25). There are 13 provinces and territories in Canada, and search engine usage varies across them (26). This section will focus on only two of the top five most commonly used Canadian search engines: AltaVista Canada and AOL Canada (27). The following sections provide a brief history, general information and information about the homepages of these two search engines.

History of AltaVista Canada

AltaVista, which means “a view from above,” grew out of the big ideas of a team of experts with a fascination for keeping track of information (28). During the spring of 1995, scientists at Digital Equipment Corporation’s research lab in Palo Alto, CA, devised a way to store every word of every HTML page on the internet in a fast, searchable index (28). This led to AltaVista’s development of the first searchable full-text database on the World Wide Web (28). Compaq became the owner of AltaVista after it purchased Digital in 1998. AltaVista was later spun off into a private company controlled by CMGI. Overture purchased AltaVista in April 2003; Overture eventually became part of Yahoo, which led to the transfer of AltaVista’s ownership to Yahoo (29). AltaVista Canada was launched in January of 1998. Over 1,500,000 unique users visited the site, and traffic grew by over 300% within the first year of its launch. AltaVista Canada was the result of an agreement between TELUS Advertising Services and Compaq Computer Corp.’s AltaVista Search Service (30).

General Information

AltaVista delivered the internet’s first Web index in 1995. It was also the first to offer multilingual search capabilities on the internet and the first to launch image, audio and video search. It has been awarded 61 search-related patents, more than any other internet search company (28). AltaVista is great for international sites: it will find sites on a topic not only in English, but in just about every language imaginable. In case the user isn’t bilingual, AltaVista also includes a very handy translation tool (31).

AltaVista Canada Homepage

The homepage of AltaVista Canada offers users various convenient, easy-to-use features. It houses a Canadian web index of more than 14,000,000 pages and allows you to search worldwide or within Canada. For the convenience of its users, AltaVista Canada provides a feature to select the language in which the results of a search are presented. AltaVista provides a free, customizable toolbar that gives the user the tools to perform searches and translations from their own browser anywhere on the web; the toolbar’s pop-up blocker gets rid of annoying pop-ups. The “Web search” searches within all AltaVista indexes, including the best image search on the web. The homepage also provides specific features that help you narrow a search to only images, music files, videos, news, or the directory for a specific topic. The “Advanced search” option provides different search features, e.g., phrase, keywords, date, file type, location and a sites-per-page option.

History of AOL Canada

AOL Canada Inc. is a leading interactive services company focused on enhancing Canadians’ online experience. The company is a strategic alliance between America Online, Inc., a wholly owned subsidiary of Time Warner, and RBC Royal Bank, the personal and commercial banking division of RBC Financial, one of North America’s leading diversified financial services companies. This alliance took place in July 1999. The company operates three interactive online services tailored to the Canadian marketplace (AOL English and French services, CompuServe and Netscape Online), as well as several leading Internet brands including AOL.CA, AOL Canada Search, AOL Instant Messenger (AIM), Netscape.ca and MapQuest (32).

General Information

Members of AOL Canada Inc. have an added advantage. In addition to regular Web access, clients may choose to surf inside of the AOL environment, which is only available to AOL Canada Inc. members. Inside the AOL environment, members receive a great deal of Canadian content that has been streamlined to meet their needs. Surfing on the World Wide Web can sometimes be overwhelming and frustrating due to the excessive amount of information available. AOL Canada Inc. has chosen the most popular channels and listed them on the Channel Menu, each channel offering an abundant amount of information provided by AOL Canada Inc. and their partners on that particular topic (33).

Homepage of AOL Canada

AOL Canada’s homepage presents a variety of features to make searching and web surfing easier for its users. Some features presented on its main page are: sign-in for AOL Instant Messenger, mail, parental controls, reminders and a calendar. It has a “Shopping online” section which gives users an option to choose the specific item they are shopping for. The “Web channels” feature gives users a chance to search the web for careers, autos, entertainment, health news, sports, etc. It also contains a section for yellow and white pages search, where users can look for people or places. AOL Canada search, which is powered by Google, lets users search by topic or keywords. The “Advanced search” feature only has options for words and phrases, rather than file type or specific sites, e.g. .edu or .org. Search results include “Sponsored links” and matching sites. There is also a “devices” tab where AOL offers Palm organizers. Other information included on the homepage is company information, news, pricing plans, employment and contact information.

Australian Search Engine

Introduction

Although the internet is globally accessible, the majority of its users reside in the more developed nations where computer usage is most common. Australia has a high rate of computer usage, which has resulted in increased use of the internet and development of search engines. The Australian population is currently 20,141,793 (34), with 10,600,000 (35) computer users, of whom 9,240,000 (24) have internet access.

History of Web Wombat

As computers were being integrated into Australian businesses and homes, companies such as Web Wombat began to develop search engines to make information more accessible to Australians and the world. As the internet grew and more pages were indexed, Web Wombat turned its focus inward on Australia (36). This strategy proved very successful, as evidenced by the fact that today Web Wombat’s spiders crawl the web for new sites to add to its already vast cache of 100,000,000 web sites, making it Australia’s leading search engine (37).

Homepage of Web Wombat

Web Wombat indexes and stores a tremendous amount of information, and the way it chooses to present it is slightly different from other search engines. The layout of Web Wombat’s home page is initially dominated by advertisements. To the left of the home page there is a box titled “Premium links”, which are not organized in any specific order: of the twenty-one listings within the box, there are three separate links to car rental sites, two to hotels, and miscellaneous links to things such as online dating. This information would be more useful to consumers if it were categorized in some fashion within a directory, or if it were displayed beside a related search. Under the “Premium links” box is the “Daily resources” box, which provides links ranging from world newspapers to Grumpy’s humor. At the top of the page there are a number of headings ranging from careers and education to auctions. The interesting thing is that when you click on auctions, it takes you directly to ebay.com.au. The careers and education tab, however, is far more objective: it provides a menu on the left with a variety of helpful links, and articles with links in the center of the page. The “Search” field is displayed in the center of the page; however, it is not very prominent amid the barrage of ads and premium links. The “Advanced search” has an interesting feature which enables people to search by “nice to have words”, “must have words”, “nice to have phrase”, and to eliminate certain words. This becomes very convenient when a user wishes to narrow the results to a desired city using “nice to have words”. By focusing the search in this way, the information delivered is made more relevant without eliminating too many possible matches.
Another nice feature is that “Advanced search” enables people to specifically search all Australian pages, Australian government sites, or Australian educational pages.
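The must-have / nice-to-have / eliminate-words scheme described above amounts to a filter-then-rank pass over candidate pages. The sketch below is a hypothetical illustration of that idea, not Web Wombat’s actual implementation; the example pages and URLs are invented:

```python
def advanced_search(pages, must=(), nice=(), exclude=()):
    """Filter and rank pages: drop any page missing a must-have word or
    containing an excluded word, then order the survivors by how many
    nice-to-have words they match (more matches rank higher)."""
    results = []
    for url, text in pages.items():
        words = set(text.lower().split())
        if any(w not in words for w in must):
            continue                          # missing a required word
        if any(w in words for w in exclude):
            continue                          # contains an eliminated word
        score = sum(w in words for w in nice)  # soft preference, not a filter
        results.append((score, url))
    return [url for score, url in sorted(results, reverse=True)]

# Invented example pages, for illustration only.
pages = {
    "wombat.au/hotels": "cheap hotels in sydney and melbourne",
    "wombat.au/cars": "car rental deals in sydney",
    "wombat.au/spam": "cheap hotels casino casino",
}
print(advanced_search(pages, must=["hotels"], nice=["sydney"], exclude=["casino"]))
```

The key design point is that “nice to have” words only affect ordering, never inclusion, which is exactly why such a search narrows results to a desired city without eliminating too many possible matches.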

UK Search Engines

Introduction

The population of the UK as of July 2004 is 60,270,708, with 25,000,000 internet users (24). There are approximately 40 top search engines in the UK; some, such as Google, Yahoo and Overture, are familiar in the United States. These search engines are customized to offer optional searches, either regionally within the UK or across the entire World Wide Web. Some search engines within the UK are more specialized, requiring a paid subscription, for example UK Plus and UK Index. And some search engines, like Great British Pages, only search British websites (39). Being a developed country, it is not surprising that the UK has a large number of search engines. The following section of the paper focuses on UK-originated search engines.

History of Excite

Excite was originally formed in 1993 by five Stanford University graduates, who called their search engine Architext. It was developed further by students at Stanford; Excite Inc. came about in 1994 and launched in 1995. Excite acquired a web crawler only in November 1996, and over the years it added features, software, and the option of searching the World Wide Web. Regional search engines such as Excite UK were also added. In 2000, Excite had an index of 250,000,000 pages and multimedia items and about 19,000,000 monthly users (40).

General Information

Using a specific UK search engine such as Excite UK can be beneficial. Excite UK limits your search geographically, so results come back on a smaller scale and are more relevant (41). If a user in the UK wants to find local news or check the stock prices of UK companies, a regional search engine will provide more relevant information.

Excite Homepage

The features of Excite UK differ from those of the usual “dot com” American search engines. The Excite UK homepage has the search box in a prominent position at the top of the page, with options below it to search either the World Wide Web or only UK sites; in this respect it is a dual search engine (UK and World Wide Web) (40, 42). Excite UK has a very busy graphical user interface (GUI), meaning that its homepage carries a lot of information, ranging from UK news headlines and images, UK stocks, UK online shopping, UK stores, horoscopes, and local weather to a directory and various other services targeted at people living in the UK (42). The Excite homepage is also customizable: you can change the color of the homepage (“skins”) to your favorite color or to the colors of your favorite football team. Besides searching the web on Excite, you can search by “Categories” for images, videos, MP3s, news, and people. The Excite UK homepage also has a currency calculator under the “Tools” section. Excite UK displays minimal advertising. When a user clicks the “More” button, different subcategories within a search can be selected (43).

Conclusion

Search engines have become an integral part of internet use for the average computer user. As more information becomes accessible on the internet, the need for credible search engines increases. As Professor Krishnamurthy stated in class, the amount of content available on the internet has grown exponentially in recent years, and as it grows, people’s search costs will increase unless search technology improves (38). Throughout the world, internet users are provided with various search engines that offer different features depending on their needs. The fulfillment of these needs results in the success of search engines around the world.

About The Author

Anastacia Chetty

George McNiel

Joel Ennis

Julie Wartes

T.J. Tiwana

University of Washington

This article was posted on August 05, 2004
