Computer Software Engineer: Career Outlook for Computer Software Engineering Professions

Computer Applications Software Engineer

Businesses, organizations and institutions often have specialized data needs. Computer application software engineers develop customized application programs based on these needs. Application software engineers analyze an organization's user needs, develop appropriate software programs and build computer networks that support these applications. Most computer application software engineers have bachelor's degrees in software engineering, computer programming, information technology or computer science.

Computer Systems Software Engineer

Computer systems software engineers aid businesses, institutions and organizations with the expansion and maintenance of existing computer networks. Technologies change rapidly, and these professionals continually adapt to user needs and update existing computer infrastructure accordingly. Computer systems software engineers typically hold a bachelor's degree in network administration, computer science, computer software engineering or information technology. They often work within a larger information technology department and must be excellent problem solvers.

Computer Software Programmer

Software programs are conceptualized, created, tested and then implemented by computer software programmers. These professionals use various computer programming codes and languages to create software that is user-friendly and efficient. Many businesses use standard, out-of-the-box software programs. However, some need customized applications. Software programmers create these individualized programs and monitor their efficiency as needs change. Computer software programmers often work with network engineers and information technology directors and hold bachelor's degrees in computer programming, software development or software engineering.

Apple Aperture 2 Photo Editing and Management Software



CUPERTINO, California—February 12, 2008 - Apple today introduced Aperture 2, the next major release of its groundbreaking photo editing and management software with over 100 new features that make it faster, easier to use and more powerful. With a streamlined user interface and entirely new image processing engine, Aperture 2 also introduces new imaging tools for highlight recovery, color vibrancy, local contrast definition, soft-edged retouching, vignetting and RAW fine-tuning, and lets users directly post their portfolios on the .Mac Web Gallery* for viewing on the web, iPhone, iPod touch and Apple TV. At a new low price of $199, anyone can easily organize, edit and publish photos like a pro.
“Many of the most respected photographers on assignment all over the world trust Aperture to organize, edit and deliver their images,” said Rob Schoeben, Apple’s vice president of Applications Product Marketing. “With its simpler interface and lower price, anyone can take full advantage of Aperture’s power.”
“At the end of the day, it’s all about the quality of the image,” said Sports Illustrated contributing photographer David Bergman. “Even before I begin making adjustments, Aperture’s new RAW processing gives me better images with more visible detail and better color rendering than any other program I’ve tested.”
“I used to have so much stress about post-production on a shoot, having to juggle multiple applications to make sure they all worked,” said Bob Davis, PDN Top Knots Wedding Photographer 2007. “With Aperture that’s no longer a factor. I can do everything all in one application.”
Featuring a new, easier user interface designed to be more intuitive and accessible, Aperture 2 now lets users navigate between Viewer and Browser modes with a single key command. Screen real estate is maximized for images with an all-in-one heads up display that allows users to toggle between library, metadata and adjustment controls in a single tabbed inspector. The All Projects view, modeled after iPhoto’s Events view, provides a poster photo for every project and the ability to quickly skim through the photos inside, and the integrated iPhoto Browser offers direct access to all the events and images in the iPhoto library.
Performance has been enhanced in Aperture 2 so it’s faster to import, browse and search large volumes of images. Embedded previews let photographers caption, keyword and rate images as they are being imported, and with the ability to export images in the background, photographers can continue working while images are processed to JPEG, TIFF, PNG and PSD file formats. Quick Preview allows users to browse RAW images in rapid succession without having to wait for files to load, and the Aperture library database has been re-architected to provide fast project switching and near instantaneous search results, even when working with extremely large libraries of 500,000 images or more.
Aperture 2 delivers powerful new imaging tools for getting the most out of each photograph. Apple’s next-generation RAW image processing is at the core of Aperture 2 offering uncompromising image quality and precision controls that let users fine-tune the image profile for each of their cameras. New tools for improving and enhancing images include Recovery for pulling back “blown” highlights, Vibrancy for selectively boosting saturation without adversely affecting skin tones, Definition, which offers local contrast for adding clarity to images, Vignette & Devignette filters for providing professional visual effects and a true soft-edged Repair and Retouch brush for quickly and easily removing blemishes, cleaning up sensor dust and cloning away problem areas.
Aperture 2 works seamlessly with Mac OS X, iLife, iWork, .Mac and Apple print products, so any image in the Aperture library can be accessed directly from within other applications, such as iMovie, Keynote and Pages, and even from within Leopard Mail. Now with .Mac Web Gallery support, Aperture users can publish their photos once to view them on the web, iPhone, iPod touch and Apple TV. Books in Aperture 2 feature new theme designs, layout tools, customized dust jackets (including full-bleed) and foil stamped covers.
Pricing & Availability
Aperture 2 is available immediately for a suggested retail price of $199 (US) through the Apple Store, Apple’s retail stores and Apple Authorized Resellers. Owners of previous versions of Aperture can upgrade to Aperture 2 for just $99 (US).

* The .Mac service is available to persons aged 13 and older. Annual membership fee and Internet access required. Terms and conditions apply.
Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh. Today, Apple continues to lead the industry in innovation with its award-winning computers, OS X operating system and iLife and professional applications. Apple is also spearheading the digital media revolution with its iPod portable music and video players and iTunes online store, and has entered the mobile phone market with its revolutionary iPhone.

How to Create Accounting Software In The Shortest Time?

Accounting software is critical to every business environment, and almost every major business application already offers its own way to integrate with popular accounting systems. There are four approaches that can be used for such integration:

Type #1 - Integration through a Software Development Kit (SDK) provided by the vendor.
The business application vendor needs an SDK from the accounting software provider, and the two must not be in direct competition, or the provider may withdraw support and authorization to use its system. In addition, every time a customer needs the accounting module, the business application vendor must pay for an accounting software license.

Type #2 - Integration through import and export.
Most accounting software vendors support this method, and it has proved to be the most standard and easiest way to integrate with an accounting system. However, the lack of customization flexibility leaves the application vendor highly dependent on the accounting vendor to do any customization work.
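Import/export integration usually comes down to agreeing on a flat file layout. As a minimal sketch (the column names and CSV layout here are hypothetical, not any particular vendor's format), a business application might round-trip invoice records like this:

```python
import csv
import io

# Hypothetical flat-file layout; real accounting packages each
# document their own import format.
FIELDS = ["invoice_no", "date", "account", "amount"]

def export_invoices(invoices):
    """Serialize invoice dicts to CSV text for the accounting
    package's import facility."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(invoices)
    return buf.getvalue()

def import_invoices(csv_text):
    """Parse the same layout back into records, restoring numeric amounts."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["amount"] = float(row["amount"])
    return rows
```

The inflexibility the article mentions is visible even in this sketch: any field the agreed file layout does not define simply cannot cross the boundary between the two systems.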

Type #3 - Open source accounting system.
Many people love this option, but commercial environments typically require more support than the open source community can provide. One of the biggest drawbacks is that the application provider must distribute its source code, or the modified version of the accounting source code, along with the product, giving up a competitive advantage to competitors.

Type #4 - Commercial royalty-free accounting source code.
This is the best alternative currently available besides creating your own accounting system. Pay a one-time fee and you receive royalty-free source code to modify, enhance and integrate, with ongoing support from the source code vendor.

To find a good accounting source code vendor, there are two criteria to be met:

Criteria #1 - Programming method.
If the source code does not use advanced programming methods such as object-oriented programming (OOP) and rapid application development (RAD), developers may need more time to change and enhance the system in the future.

Criteria #2 - Compiler advantages.
The source code should keep pace with current compiler technology, taking advantage of modern language and tooling features rather than legacy coding styles that take a long time to integrate.

Utilizing available resources will greatly shorten development time and help vendors bring accounting functionality in their applications to market faster.

Evidence Management and Computer Technology

Evidence management poses the greatest routine stress of law enforcement activities. Missing evidence can shake the foundation of any agency, and criminal or substandard management can end careers.
Computerized evidence storage streamlines the paper trail that otherwise only adds work for an already heavily tasked property technician.
Bar code technology was adopted to allow fast and accurate entry of articles. With scanning devices and associated commands identifying and documenting the flow of evidence, personnel gained greater flexibility in setting up programs with query capabilities by name, case number, location, status, and descriptors, to name a few. Bar code systems greatly aid the inventory function by increasing the speed of operations, saving budget funds and lowering liability.
A well-planned records system is essential to operating an effective property control system.
Seven objectives top the list of priorities and goals to keep in mind when designing or modifying a property system.
  • Document responsibilities;
  • Establish continuous custody records;
  • Prevent loss or unauthorized release of evidence;
  • Document accurate descriptions of each piece of evidence and location;
  • Document unique or unusual circumstances regarding release or transfer;
  • Record date, purpose, and signature(s) of individuals checking property out;
  • Document destruction, auction or any other movement of property.
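A computerized property system turns those objectives into a continuous custody record per item. As a minimal sketch (the field names are illustrative, not any agency's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class CustodyEvent:
    date: str
    action: str        # e.g. "checked in", "checked out", "destroyed"
    signature: str     # who handled the property
    purpose: str = ""  # why it was released or transferred

@dataclass
class EvidenceItem:
    case_number: str
    description: str
    location: str
    custody_log: list = field(default_factory=list)

    def record(self, date, action, signature, purpose=""):
        """Append one entry to the item's continuous custody record."""
        self.custody_log.append(CustodyEvent(date, action, signature, purpose))

def find_by_case(items, case_number):
    """Query by case number, one of the lookups bar code systems provide."""
    return [item for item in items if item.case_number == case_number]
```

Even this toy model covers several of the objectives above: accurate descriptions and locations, continuous custody entries with date, purpose and signature, and documented destruction or transfer.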
Major deficiencies in many evidence systems are the lack of written procedures, lack of comprehensive inventory report forms, and inadequate quality control.
Computerized evidence and property control systems can generate accurate and timely management of data. Computers can track statistical data in various methods to provide justification for personnel and equipment needs during budget preparation.
Few programs on the market are designed to truly fit the needs of law enforcement or evidence managers. Most agencies are faced with getting by on what is out there or attempting to modify their current system to meet their standards of operation. Property personnel have been led to believe that bar coding is computerization, when in essence it is an additional tool to consider after, or in conjunction with, computerization.
The initial information that must be entered into the system manually accounts for many hours at the keyboard, and some agencies spend countless additional hours duplicating these efforts. Assess any and all alternatives available to reduce this effort: research methods that allow booking officers to type report information into a portable computer with downloading capability, or look into scanning reports into the system to save time.
Most evidence rooms may not be faced with difficulty in handling the daily intake. Their main concerns may lie with case dispositions and obtaining information from courts or detectives in a timely fashion. The majority of property that is stored never gets checked out for court, analysis, investigative review, etc. but we are required to hold onto these items until a final disposition has been reached. We are scrambling to look for better ways to approach this problem and computers may be the link that can tie us all together from a State level.
The main conclusion is to build a solid foundation of formal procedures, adequate training, continuing supervisory control, and an efficient property control record system. Whether you operate a manual paper tracking system or a computerized one, this foundation is the key to properly running and managing the property and evidence function.

Career in Computer Programming

As long as technology continues to develop, the demand for trained and skilled professionals in the IT sector will continue to thrive. Businesses and industries are always searching for professional programmers in departments like administration, security, and management. The growth of the Internet has seen a rise in wireless operations, networks, and client/server developments. With technology rapidly changing, the need for computer programming jobs is expected to increase to fulfill the growing demand.
As a programmer, your responsibilities evolve continuously. When choosing a computer programming degree, look for a course that is up to date with latest developments, advancements and equipment. In order to succeed, programmers need to constantly brush up their skills and knowledge in the field. Job applicants can also enhance their chances in the competitive job market by becoming certified in various languages. Many computer programming degrees also train their students for relevant certifications.
So, what exactly do computer programmers do? In short, they develop the instructions and languages that computers need to function smoothly. These functions could range from a short program to a lengthy process that could take a few years to create and implement. Programmers are also responsible for testing systems for errors and resolving issues and problems that may crop up. This process involves the use of complex technological codes or languages. Most programmers know and specialize in a variety of languages. Programmers may also be called to write manuals and instructions for other program users of a particular system or mainframe.
Many computer programmers work on a contractual basis or independently as consultants. Companies that require professionals specializing in a particular language or application may outsource the job to computer programming consultants. Contracts can run from a few weeks to more than a year, so the commitment demanded on such jobs is extremely high.
Programmers can be categorized into applications programmers or systems programmers. Applications programmers are those that create and modify programs for a specific purpose or cause. Systems programmers on the other hand work on a larger scale and deal with developing computer networks and operating systems. They are responsible for the effective functioning of computer hardware as well. With the rise in software packages, a new breed of software development programmers has emerged. They work with other programmers in order to create customized or packaged software such as games and other programs used for financial management and educational purposes.
Computer programmers account for nearly one and a half million jobs, and the numbers continue to grow. Computer programmers are required in telecommunications, management, education, government and finance, to name a few. If a degree in computer programming is what you want to pursue, you can be sure of one thing - there's a bright and lucrative future for such candidates.
Stevens-Henager College was established in 1891 and is distinguished as one of the oldest colleges in Utah offering degree programs, both on-campus and online for Master's, Bachelor's, and Associate's Degrees. Working professionals can enhance their career and qualifications with the in-demand online degree programs offered by Stevens-Henager College.

about joomla

Built-in applications are plentiful in this system. The blog function is quite comprehensive. Between the built-ins and the plugins, Joomla! can be used to create corporate sites or portals, corporate intranets and extranets, online zines, newspapers and other publications, e-commerce and online reservations, small business sites, non-profit sites, school and church Web sites and even personal or family sites.
Joomla! applications will keep track of every piece of content on your site from text and photos to music and video.
Choose to manage menus, content, components, extensions or tools from the admin navigation bar. The text editor is self-explanatory.
Within the content tab, we chose “new article” and tried to input some text to use for the front page. Joomla! won’t let you publish or save this information unless you have first created a section and category in which the content can live. It only took a few seconds to navigate back to the section and category managers, where we created, named and saved those areas.
Next we went back to the content tab and selected the article function, at which point we successfully added some text for the front page.
If you want to just blog with this program, name a section and category, then when you create an article select “front page” in the editor and your blog entries will appear in standard blog format.
Each post or article can be optimized inside the editor as well – you can enter meta data and keywords for each post.
Joomla! does not submit your content to blog directories for you, so if you use Joomla you will have to register with the aggregator and/or directory services of your choice so your entries are submitted to the search engines.
In this system, an Article is some written information that you want to display on your site. It normally contains some text and can contain pictures and other types of content. Sections and categories in Joomla! provide an optional method for organizing your articles. Here's how it works. A Section contains one or more categories, and each Category can have articles assigned to it. One Article can only be in one Category and Section.
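That Section → Category → Article containment rule can be sketched as a small object model (a reconstruction for illustration only, not Joomla!'s actual PHP classes):

```python
class Section:
    def __init__(self, name):
        self.name = name
        self.categories = []

class Category:
    def __init__(self, name, section):
        self.name = name
        self.section = section          # each category lives in exactly one section
        section.categories.append(self)
        self.articles = []

class Article:
    def __init__(self, title, category, front_page=False):
        self.title = title
        self.category = category        # exactly one category, hence one section
        self.front_page = front_page    # mimics the "front page" toggle in the editor
        category.articles.append(self)
```

Because an article holds a single category reference, and a category a single section reference, the one-category-one-section rule the article describes falls out of the structure itself.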

Top 5 Computer Viruses

They can wipe out your hard drive.  They can crash servers and send shock waves around the cyber world.  And they can embarrass you.  Here are the biggest computer viruses in cyber history, so far:

1.    Brain (1986)

This is the one that started it all.  Created by two Pakistani brothers, the Brain virus is widely considered to be the first PC computer virus ever.  It wasn't particularly virulent.  It was a boot sector infector, and was accompanied by the creators' names and phone numbers.   The brothers swore the intention was not malicious, but the genie was out of the bottle; the Brain virus became the model for numerous viruses and other malware for years to come.

2.    Melissa (1999)

The Melissa virus was one of the first viruses to be spread via e-mail, crashing corporate networks by taking advantage of the increasingly popular Microsoft Outlook mail client.  The virus was spread by an infected attachment to the email, and automatically sent itself to the first 50 names in the user's address book, making it the first virus to move from one computer to another on its own.  The amount of email passing through servers quickly increased exponentially, forcing shutdowns and business stoppages.  The Melissa virus was also noteworthy because its creator was caught and sent to prison.

3.    ILOVEYOU (2000)

This is the virus that made us all afraid to open attachments. ILOVEYOU was one of the first to trick users into opening the file that actually activated the virus.  The ILOVEYOU worm masqueraded as a love letter sent via e-mail, but instead was a computer script that sent copies of itself to users' Microsoft Outlook address book entries, overwrote and deleted computer files, modified Internet Explorer pages, and even interfered with Registry keys.  It caused billions of dollars worth of damages, and is still considered one of the worst worms ever.

4.    Klez (2001)

Yet another e-mail virus, Klez makes the list because it pioneered "spoofing," the trick of making it seem as if an email comes from someone other than the actual sender.  Thanks to Klez, you can't trust that an email that appears to be from your mom is actually from your mom; a spammer may have "spoofed" the From field.  Klez spreads itself using open networks and e-mail.  Unlike other viruses, it doesn't require Microsoft Outlook to spread itself - and while it isn't a particularly malicious virus, it can corrupt files and interfere with some software.  Klez is also notoriously hard to exterminate, and versions of it continue to turn up fairly regularly.
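Spoofing works because the From header is just text the sending software fills in; nothing in the mail format itself verifies it. A minimal illustration using Python's standard email library (the addresses are made up):

```python
from email.message import EmailMessage
from email import message_from_string

msg = EmailMessage()
msg["From"] = "mom@example.com"   # arbitrary text; the message format does not check it
msg["To"] = "you@example.com"
msg["Subject"] = "hi"
msg.set_content("The From header is whatever the sender chooses to write.")

# The recipient's client parses the same unverified header back out.
received = message_from_string(msg.as_string())
print(received["From"])
```

Sender-authentication schemes such as SPF and DKIM were later layered onto email precisely because the header alone cannot be trusted.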

5.    Benjamin (2002)

Benjamin is significant in virus history because it's the first that spread itself via a file-sharing program, in this case, Kazaa.  It tricked users into thinking they were downloading media files.  Instead, they downloaded the Benjamin virus.  Benjamin created its own folder on the user's computer, filled the folder with replicated versions of itself, and then made itself available for sharing to other Kazaa users.  Benjamin's primary effects were to overload a user's hard drive and eventually to slow down networks, but aside from that it wasn't particularly malevolent.

BONUS:  THE ANNA KOURNIKOVA VIRUS (2001)

The Anna Kournikova virus didn't really cause any major harm or data loss, but it did cause embarrassment.  It was spread by users clicking on an attachment that promised to be a hot picture of the tennis star.  Naturally, opening the attachment unleashed the virus, which sent itself to all of the contacts in the user's Microsoft Outlook address book.  Server overload caused business stoppages, and numerous users were forced to admit that they found the promise of an Anna Kournikova image impossible to resist.

Top 10 Tips for Business Website Success


  1. Have (at least) an idea of what your budget is for your website development – initial and ongoing.
  2. Clearly define (to yourself) your expectations for your site. More sales? Cost savings? More enquiries? How much, and by when? Ensure that your site has bold 'calls to action' that will help deliver on these expectations.
  3. Get real. Ask prospective website developers if your expectations are realistic given your budget. Have broad shoulders, and be prepared to either reduce your expectations or increase your budget.
  4. The bigger and bolder your vision for your website, the higher the importance of planning in advance of a single line of code being written. Get a website planning expert to help you do this - either from within a website company, or an independent  website planning consultant (yes, they do exist). Expect to pay professional rates for a professional plan.
  5. Websites do not market themselves (honestly!). Do not allocate your entire budget to your website. Marketing of your site, online and/or offline (traditional), is essential *if* driving lots of prospective customers to your site is important. If it is, then seriously consider allocating at least 30% of your total first year budget on marketing. Some companies will allocate many multiples of their website budget for marketing activities.
  6. All other things being equal, you will get what you pay for. Larger companies will spend hundreds of thousands of dollars on their websites. Do not expect to get a site - or results - like theirs with a budget of a few thousand dollars (see 'Get Real').
  7. Choose a reputable website development company that's a good fit for your company.
    1. Proven track record. Hey, you might get lucky with a start-up, but remember that most new businesses don't make it past 3-5 years.
    2. A portfolio of website designs that you like. Good design really matters. You only get one chance to make a first impression.
    3. Of a size that should deliver a consistent level of service to you. One- or two-person companies need holidays and get sick from time to time. Is that OK with you?
    4. Low staff turnover
    5. Appear genuinely interested in listening to your needs and expectations
    6. Aren’t afraid to give you advice you might not want to hear.
    7. Are offering business solutions that match your requirements – not pushing package ‘A’, ‘B’ or ‘C’.
    8. Perform the core work in-house. You will get better results if you can physically meet and talk with the people who actually do the work.
    9. Specialise in websites, as opposed to ‘also do websites’
    10. Can demonstrate that they understand what a 'fundamentally search engine friendly' website looks like.
  8. Don't neglect business fundamentals. The success/failure of your business is determined by many factors, including how great your website is. Look after the following:
    1. Cashflow
    2. Having the right people on board
    3. Your passion for your business
    4. Knowing your numbers
    5. Customer service
    6. Great systems
    7. Reliable supply channels
    8. Sufficient margins/markup
    9. Effective marketing
    10. Access to impartial business advice
  9. Understand that there is no such thing as a perfect or a finished website. Every truly successful website is under regular development/refinement. This is why it's so important to
    1. Choose a 'good fit' website company. Can you work with them on an ongoing basis? Do you share similar visions/values?
    2. Ensure that systems are in place, human and computer, to monitor and report on the performance of your website, and be prepared to tweak and tune your site over time to increase your success.
  10. Understand where your own expertise lies in the website development process. Let the experts in the company you've chosen do what they do best. Trusting them to do this shouldn't be an issue if you've chosen wisely.

Hardware Engineering

There's a joke circulating across the Internet that goes: "A software engineer, a hardware engineer, and a departmental manager were driving down a steep mountain road when suddenly the brakes on their car failed. The car careened out of control down the road, bouncing off the mountainside. The car's occupants, shaken but unhurt, discussed what to do next.

The departmental manager said, 'Let's have a meeting, propose a vision, formulate a mission statement, define a set of goals, and by a process of continuous improvement, find a solution to the critical problems, and then we can be on our way.'

'No, no,' said the hardware engineer, 'That will take far too long, and besides, that method has never worked before. I've got my Swiss Army knife with me, and in no time at all I can strip down the car's braking system, isolate the fault, fix it, and we can be on our way.'

'Well,' said the software engineer, 'Before we do anything, I think we should push the car back onto the road and see if it happens again.'"

While that joke may not be particularly flattering to engineers, it does shed some light on the nature of the hardware engineer's function. Generally, these engineers are concerned with the design, development, and testing or debugging of computer hardware: the nuts and bolts of a system. As the hardware engineer in the joke displayed, technical professionals in this capacity like to strip a problem down to the wires and fix it fast.

Employers can range from computer manufacturers to universities to any company that uses computer networks. When hiring hardware engineers, all look for qualities such as the ability to ensure hardware reliability, isolate problems quickly and improve the computer development process.

Sound like abilities you possess? If so, you might be interested in this dynamic field, which attracts technical professionals from electrical, systems and programming backgrounds. The day-to-day pace is varied and the challenges are numerous, but for the right candidates, the rewards are no joke.

windows 7

Windows 7 is upon us. Microsoft's new operating system has finally made it to retail stores across the country, bringing with it an improved user interface, layers of much-needed polish over Windows Vista, DirectX 11, and even a fresh version of the Windows Calculator. Also, that old Windows 3.1-era font dialog Microsoft inexplicably kept around forever is gone. You can see why everyone is excited.
It's about time for a new system guide, wouldn't you say? We're happy to oblige. Keep reading to see how we've updated our four builds with DirectX 11 goodness, not to mention our tips for outfitting your new PC with the right version of Windows 7.
The first thing you should know about this guide is that it's geared toward helping you select the parts for a home-built PC. If you're new to building your own systems and want a little extra help, our tutorial on how to build your own PC is a great place to start and a helpful complement to this guide.

Before tackling our recommended systems, we should explain some of the rules and guidelines we used to select components. The guiding philosophy behind our choices was to seek the best bang for the buck. That means we avoided recommending super-cheap parts that are barely capable of performing their jobs, just as we generally avoided breathtakingly expensive products that carry a hefty price premium for features or performance you probably don't need. Instead, we looked to that mythical "sweet spot" where price and performance meet up in a pleasant, harmonic convergence. We also sought balance within each system configuration, choosing components that make sense together, so that a fast processor won't be bottlenecked by a skimpy graphics card or too little system memory, for instance. The end result, we hope, is a series of balanced systems that offer decent performance as configured and provide ample room for future expandability.

We confined our selections to components that are currently available online. Paper launches and preorders don't count, for obvious reasons. We also tried to stick to $500, $800 and $1200 budgets for our three cheapest desktop systems. Those budgets are loose guidelines rather than hard limits, to allow us some wiggle room for deals that may stretch the budget a little but are too good to resist.

We've continued our tradition of basing the guide's component prices on listings at Newegg. We've found that sourcing prices from one large reseller allows us to maintain a more realistic sense of street prices than price search engine listings, which are sometimes artificially low. In the few cases where Newegg doesn't have an item in stock, we'll fall back to our trusty price search engine rather than limit our options.

Finally, price wasn't the top factor in our component choices. Our own experiences with individual components weighed heavily on our decisions, and we've provided links to our own reviews of many of the products we're recommending. We've also tried to confine our selections to name-brand rather than generic products—and to manufacturers with solid reputations for reliability. Warranty coverage was an important consideration, as well.

Computer Centre

The Computer Centre was set up in 1967 as an attached office of the Department of Statistics, with three second-generation computer systems to cater to the data processing needs not only of the Department of Statistics but of other Ministries and Departments of the Union government as well. From its very inception, the Centre has effectively performed the pioneering task of building up data processing capabilities in many organisations of the Government of India, and has played a vital role in imparting intensive training in systems analysis and design, software (including programming languages) and data processing, as well as in providing consultancy. Until recently, the Centre's major responsibility related to data preparation and the processing of data collected through various socio-economic surveys, Economic Censuses, enterprise surveys, price data and the Annual Survey of Industries conducted by the NSSO and the CSO; its role has now been diversified. At present the Centre possesses a large number of PCs and a group of highly skilled and experienced EDP professionals.

Data Preparation

Data preparation is an integral part of data processing. Besides processing data made available to it on magnetic tapes as inputs, the Computer Centre also had a system of data preparation by keying in data through data entry machines. For data entry from schedules and questionnaires, the Centre has been using PC-based electronic machines since 1994. The Centre is currently responsible for data preparation (i.e. keying in data through data entry machines) relating to:

o Consumer Price Index for Urban Non-Manual Employees, CPI(UNME) – monthly returns.

o National Sample Survey Socio-Economic Survey Listing schedules (0.1 and 0.2) from the 51st round onward.

Data Processing:

Over the years, the Computer Centre has undertaken computing jobs for the processing of voluminous data varying widely in structure and content. It shouldered the responsibility of processing and tabulating NSS data (and was partly involved in the data cleaning effort) right from the 27th round (1972-73) through the 50th round (1993-94), and the ASI data from 1974-75 up to ASI 1994-95. Tabulation based on past data is also taken up from time to time to meet the specific requirements of the Planning Commission and other users. The entire software for processing and tabulating the data of the Economic Censuses (EC) of 1977, 1980, 1990 and 1998 was developed at the Centre and provided to the States for processing and tabulating State-level data. Besides giving technical guidance to the States, the Computer Centre carried out the data entry and processing of the State EC-98 data for Meghalaya and Orissa. Generation of all-India tables based on State-level tabulations was also done at the Centre. The data of different enterprise surveys of the CSO are also processed at the Centre, which likewise processes and produces the price index for urban non-manual employees, CPI(UNME), on a regular basis. The latest data processing project completed by the Centre was the Time Use Survey conducted by the Central Statistical Organisation in close coordination with six participating states, namely Haryana, Gujarat, Orissa, Madhya Pradesh, Tamil Nadu and Meghalaya.

Data Preservation and Dissemination:

As per the "National policy on dissemination of statistical data", Computer Centre has preserved a large volume of data generated through various socio-economic surveys, Enterprise Surveys, Economic Censuses, Annual Survey of Industries and price data on CD-ROMs. These data are being disseminated regularly to a large number of national and international users. Technical guidance for the use of basic data and their processing is also provided to the users both within and outside the country on request.

Training Activity:

Training is one of the main activities of the Centre. It has been conducting EDP courses for various State and Central government departments and international agencies, and over time has trained a large number of officers in electronic data processing. Besides, it conducted six "Programmer"-level courses and two "Training of Trainers"-level courses in EDP for UN-sponsored candidates from ESCAP-region countries under the United Nations Household Survey Capability Programme between 1983 and 1991. EDP training for ISS probationers was also conducted on a continuous basis up to 1995. Currently the Centre conducts Information Technology (IT) courses, courses on software packages for middle-level ISS officers, and in-house training courses. Training on specific modules covering Oracle & Developer 2000, C++, Visual Basic, Visual FoxPro, the Internet, web design and networking has also been organised regularly.

Web-site:

The web-site of the Ministry of Statistics and Programme Implementation has been indigenously designed and is maintained by the Computer Centre. Press releases are uploaded the same day; other material is uploaded in the shortest possible time after the necessary editing and preparation. To provide better access to users, the Centre plans to make it a dynamic web-site, and also to make the entire web-site bilingual as per the PMO's guidelines. The Computer Centre has completed a project for NSSO (FOD) on designing exclusive web-pages, including the computerization of Annual Survey of Industries activities.

The web-site has recently been redesigned. Besides making it more attractive, a hit counter, search engine icons, a form for suggestions from users and a registration form for downloading reports have been added. The objective is to make the web-site, to the extent possible, menu driven. The address of the web-site is: http://mospi.gov.in

Data Warehouse on Official Statistics:

In the light of advances in the field of Information Technology, statistical data can now play a very vital role in planning for development, including the industrialization of any country. In view of this, the key statistical data pertaining to all major sectors of the economy must be available simultaneously through the Internet to all users associated with planning and development, as well as to researchers. For this, the Centre has taken up the project of creating and maintaining a National Data Warehouse of Official Statistics from the data generated through sample surveys, censuses, enquiries, etc. Under this project, the Centre will preserve data generated by various Central Ministries, Departments and Public Sector Undertakings on electronic media, organize the data in the form of databases, create the data warehouse and provide remote access facilities to end-users through a network. However, this is a new and evolving concept, and it will take some time before the warehouse becomes fully functional and usable. The Computer Centre has made a beginning by starting the preparation of databases, an essential step in the creation of a data warehouse. A Direction Committee under the chairmanship of the DG & CEO, NSSO, with members from IIT Delhi, IASRI Delhi, RBI and NIC besides senior officers of the Ministry, has been constituted to advise the Computer Centre on all aspects of setting up the data warehouse, including determination of the appropriate hardware configuration, selection of software and a consultant, and training of the Centre's officers. On the advice of the Direction Committee, the Computer Centre prepared an approach paper giving the broad outlines of the project. Further, the Australian Bureau of Statistics and Statistics Canada, which have already set up data warehouses in their countries, have been approached to share their experience and the architecture and approach they followed.

While considering the approach paper, the Direction Committee, in its meeting held in November 2002, decided that there was an imperative need to appoint a consultant to assist the Computer Centre in this project. The consultant, besides carrying out system analysis, system design, software selection, hardware selection and application software development, would undertake a pilot project jointly with the officers of the Computer Centre on live databases of at least two sectors for the development of the data warehouse. The consultant would also suggest and organize appropriate training for the officers of the Computer Centre. The Direction Committee has recommended that Cognos data warehouse tools be used for the project. It is expected that the pilot project on creating a data warehouse on Consumer Expenditure and Employment & Unemployment will be completed by the end of the current financial year.


2012

2012 is not a disaster. That's one thing critics, for the most part, agree on -- to various degrees. Roger Ebert in the Chicago Sun-Times even goes so far as to call it "the mother of all disaster movies" -- largely because the movie doesn't merely show a few recognizable landmarks being destroyed -- but the entire Earth. "You think you've seen end-of-the-world movies?" he remarks. "This one ends the world, stomps on it, grinds it up and spits it out." His conclusion: "The movie gives you your money's worth. Is it a masterpiece? No. Is it one of the year's best? No. Does Emmerich hammer it together with his elbows from parts obtained from the Used Disaster Movie Store? Yes. But is it about as good as a movie in this genre can be? Yes." Many reviewers note that it's a useless enterprise to try to critique the screenplay -- which is based on the premise that ancient Mayans forecast the end of the world on December 21, 2012 -- the final day of their calendar. (They apparently did not forecast the end of their own civilization, which occurred hundreds of years earlier.) That hasn't stopped others from zeroing in on the plot. Manohla Dargis in the New York Times, for one, comments: "Despite the frenetic action scenes, the movie sags, done in by multiple story lines that undercut one another." Claudia Puig in USA Today sums up: "The movie is an undeniable visual spectacle, but just as unequivocally a cheesy, ridiculous story." Lou Lumenick in the New York Post won't even grant that it's cheesy, calling it instead "pure Velveeta" -- but, ah, the spectacle. "About the only thing that's missing from 2012 (except sanity)," he writes, "is 3-D, IMAX and Sensurround. For those, I would gladly pay $20 a ticket." Noting that the movie reportedly cost $260 million to make, Elizabeth Weitzman writes in the New York Daily News: "All that money can buy some jaw-dropping special effects, but not, it seems, a script worth a dime."
Still, Tom Maurstad in the Dallas Morning News thinks it was probably a good idea to present a threadbare story. "If the viewer were ever invited to think or feel about what's happening on-screen, the movie's wow-whoa-ain't-it-cool momentum would collapse in a heap of horrific preposterousness," he writes. And Mick LaSalle in the San Francisco Chronicle gives it a rave review, while admitting, "It's hard to do justice to this ridiculous, wonderful movie." LaSalle makes the point: "People talk about 'formula' almost always as a pejorative, but formulas get to be formulas because they work, and there's something to be said for a formula picture done almost to perfection." On the other hand, Joe Morgenstern in the Wall Street Journal hasn't a kind word to say about either the story or the effects, tagging the movie "destructo drek."

Database Design and Modeling Fundamentals

Database design and the creation of an entity relationship diagram (also known as an "ERD" or data model) is an important yet sometimes overlooked part of the application development lifecycle. An accurate and up-to-date data model can serve as an important reference tool for DBAs, developers, and other members of a JAD (joint application development) team. The process of creating a data model helps the team uncover additional questions to ask of end users. Effective database design also allows the team to develop applications that perform well from the beginning. By building quality into the project, the team reduces the overall time it takes to complete the project, which in turn reduces project development costs. The central theme behind database design is to "measure twice, cut once".

Effective database designers will keep in mind the principles of normalization while they design a database. Normalization is a database design approach that seeks the following four objectives:
  1. minimization of data redundancy,
  2. minimization of data restructuring,
  3. minimization of I/O by reduction of transaction sizes, and
  4. enforcement of referential integrity.
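The first objective, minimizing data redundancy, can be illustrated with a small sketch. The table and column names below (a store/employee schema echoing the examples later in this article) are hypothetical, and SQLite is used only because it ships with Python; the same design applies to any relational database. In a single flat table, the store's city would be repeated on every employee row; the normalized design stores it exactly once.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized design: each store's details are stored once, and employees
# refer to the store by key instead of repeating its city on every row.
cur.execute("""CREATE TABLE store_location (
    store_id INTEGER PRIMARY KEY,
    city TEXT NOT NULL)""")
cur.execute("""CREATE TABLE employee (
    employee_id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    store_id INTEGER REFERENCES store_location(store_id))""")

cur.execute("INSERT INTO store_location VALUES (1, 'Springfield')")
cur.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                [(1, 'Ann', 1), (2, 'Bob', 1)])

# A join reassembles the flat view on demand; the city itself lives in
# only one place, so correcting it later is a single-row update.
rows = cur.execute("""SELECT e.name, s.city
                      FROM employee e
                      JOIN store_location s USING (store_id)
                      ORDER BY e.employee_id""").fetchall()
print(rows)  # [('Ann', 'Springfield'), ('Bob', 'Springfield')]
```

If the store moved, only the one row in store_location would change, rather than every employee row, which is precisely the restructuring and redundancy cost that normalization avoids.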
The following concepts and techniques are important to keep in mind when designing an effective database:
  1. An entity is a logical collection of things that are relevant to your database. The physical counterpart of an entity is a database table. Name your entities in singular form and in ALL CAPS. For example, an entity that contains data about your company's employees would be named EMPLOYEE.

  2. An attribute is a descriptive or quantitative characteristic of an entity. The physical counterpart of an attribute is a database column (or field). Name your attributes in singular form with either Initial Capital Letters or in all lower case. For example, some attribute names for your EMPLOYEE entity might be: EmployeeId (or employee_id) and BirthDate (or birthdate).

  3. A primary key is an attribute (or combination of attributes) that uniquely identifies each instance of an entity. A primary key cannot be null, and the value assigned to a primary key should not change over time. A primary key also needs to be efficient. For example, a primary key that is associated with an INTEGER datatype will be more efficient than one that is associated with a CHAR datatype. Primary keys should also be non-intelligent; that is, their values should be assigned arbitrarily, without any hidden meaning. Sometimes none of the attributes of an entity are sufficient to meet the criteria of an effective primary key. In this case the database designer is best served by creating an "artificial" primary key.
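An "artificial" (surrogate) key of the kind described above can be sketched as follows. The employee table is hypothetical; SQLite is used here as a convenient stand-in, where an INTEGER PRIMARY KEY column is assigned automatically when omitted from the insert, giving a non-intelligent key with no hidden meaning.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

cur.execute("""CREATE TABLE employee (
    employee_id INTEGER PRIMARY KEY,  -- artificial, non-intelligent key
    name TEXT NOT NULL,
    birthdate TEXT)""")

# The key is not supplied by the application; the database assigns an
# arbitrary integer that never needs to change, even if the name does.
cur.execute("INSERT INTO employee (name, birthdate) VALUES ('Ann', '1980-05-01')")
cur.execute("INSERT INTO employee (name, birthdate) VALUES ('Bob', '1975-11-23')")

ids = [r[0] for r in cur.execute(
    "SELECT employee_id FROM employee ORDER BY employee_id")]
print(ids)  # [1, 2]
```

By contrast, keying the table on, say, the employee's name would violate two of the criteria above: names change over time, and CHAR comparisons are slower than INTEGER ones.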

  4. A relationship is a logical link between two entities. A relationship represents a business rule and can be expressed as a verb phrase. Most relationships between entities are of the "one-to-many" type in which one instance of the parent entity relates to many instances of the child entity. For example, the relationship between EMPLOYEE and STORE_LOCATION would be represented as: one STORE_LOCATION (parent entity) employs many EMPLOYEEs (child entity).

  5. The second type of relationship is the "many-to-many" relationship, in which many instances of one entity relate to many instances of the other entity. "Many-to-many" relationships need to be resolved in order to avoid data redundancy. They may be resolved by creating an intermediate entity known as a cross-reference (or XREF) entity. The XREF entity is made up of the primary keys of both original entities, and both original entities become its parents. Thus, the "many-to-many" relationship is resolved into two "one-to-many" relationships. For example, the "many-to-many" relationship "(many) EMPLOYEEs are assigned (many) TASKs" can be resolved by creating a new entity named EMPLOYEE_TASK, yielding the two "one-to-many" relationships EMPLOYEE (parent entity) is assigned EMPLOYEE_TASK (child entity) and TASK (parent entity) is assigned to EMPLOYEE_TASK (child entity).
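The EMPLOYEE/TASK resolution described above can be sketched in DDL. This follows the article's own example names (rendered in lower case for SQL), with SQLite standing in for the database; the XREF table's primary key is the combination of the two parents' keys, so each employee/task pairing can appear only once.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

cur.execute("CREATE TABLE employee (employee_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE task (task_id INTEGER PRIMARY KEY, title TEXT)")

# XREF entity: its composite primary key combines both parents' keys,
# turning one many-to-many into two one-to-many relationships.
cur.execute("""CREATE TABLE employee_task (
    employee_id INTEGER REFERENCES employee(employee_id),
    task_id INTEGER REFERENCES task(task_id),
    PRIMARY KEY (employee_id, task_id))""")

cur.executemany("INSERT INTO employee VALUES (?, ?)", [(1, 'Ann'), (2, 'Bob')])
cur.executemany("INSERT INTO task VALUES (?, ?)", [(10, 'Audit'), (20, 'Filing')])

# Ann is assigned both tasks; Filing is shared by both employees --
# exactly the many-to-many situation the XREF entity exists to hold.
cur.executemany("INSERT INTO employee_task VALUES (?, ?)",
                [(1, 10), (1, 20), (2, 20)])

pairs = cur.execute("""SELECT e.name, t.title
                       FROM employee_task et
                       JOIN employee e USING (employee_id)
                       JOIN task t USING (task_id)
                       ORDER BY e.name, t.title""").fetchall()
print(pairs)  # [('Ann', 'Audit'), ('Ann', 'Filing'), ('Bob', 'Filing')]
```

Note that this is also an example of an identifying relationship, discussed below, since each parent's key forms part of the child's primary key.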

  6. A "foreign key" exists when the primary key of a parent entity is present in a child entity. A foreign key requires that a value be present in the parent entity before a like value may be inserted in the child entity. The discipline of maintaining foreign keys is known as "referential integrity".
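The referential-integrity rule can be demonstrated directly. The store/employee tables are hypothetical, and one SQLite-specific detail applies: foreign key enforcement must be switched on per connection with a PRAGMA. Once it is, inserting a child row whose key value has no matching parent row is rejected.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

con.execute("CREATE TABLE store_location (store_id INTEGER PRIMARY KEY, city TEXT)")
con.execute("""CREATE TABLE employee (
    employee_id INTEGER PRIMARY KEY,
    name TEXT,
    store_id INTEGER NOT NULL REFERENCES store_location(store_id))""")

con.execute("INSERT INTO store_location VALUES (1, 'Springfield')")
con.execute("INSERT INTO employee VALUES (1, 'Ann', 1)")  # parent exists: accepted

try:
    # No store 99 exists in the parent table, so this violates
    # referential integrity and the database refuses the row.
    con.execute("INSERT INTO employee VALUES (2, 'Bob', 99)")
    violated = False
except sqlite3.IntegrityError:
    violated = True

print(violated)  # True
```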

  7. Relationships between two entities may be classified as being either "identifying" or "non-identifying". Identifying relationships exist when the primary key of the parent entity is included in the primary key of the child entity. On the other hand, a non-identifying relationship exists when the primary key of the parent entity is included in the child entity but not as part of the child entity's primary key. In addition, non-identifying relationships may be further classified as being either "mandatory" or "non-mandatory". A mandatory non-identifying relationship exists when the value in the child table cannot be null. On the other hand, a non-mandatory non-identifying relationship exists when the value in the child table can be null.

  8. Cardinality helps us further understand the nature of the relationship between the child entity and the parent entity. The cardinality of a relationship may be determined by asking the following question: "How many instances of the child entity relate to each instance of the parent entity?". There are four types of cardinality: (1.) One to zero or more (common cardinality), (2.) One to one or more (P cardinality), (3.) One to zero or one (Z cardinality), and (4.) One to exactly N (N cardinality).
In conclusion, effective database design can help the development team reduce overall development time and costs. Undertaking the process of database design and creating a data model helps the team better understand the user's requirements and thus enables them to build a system that is more reflective of the user's requirements and business rules. The act of performing database design is platform-independent so persons who use database systems other than SQL Server should also be able to benefit from these concepts.

all about IT

Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.

Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term has become very recognizable. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.

When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. When speaking of information technology (IT) as a whole, then, the use of computers and the handling of information are understood to go hand in hand.

Recently, ABET and the ACM have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study, separate from both Computer Science and Information Systems. SIGITE is the ACM working group for defining these standards.