windows 7

Windows 7 is upon us. Microsoft's new operating system has finally made it to retail stores across the country, bringing with it an improved user interface, layers of much-needed polish over Windows Vista, DirectX 11, and even a fresh version of the Windows Calculator. Also, that old Windows 3.1-era font dialog Microsoft inexplicably kept around forever is gone. You can see why everyone is excited.
It's about time for a new system guide, wouldn't you say? We're happy to oblige. Keep reading to see how we've updated our four builds with DirectX 11 goodness, not to mention our tips for outfitting your new PC with the right version of Windows 7.
The first thing you should know about this guide is that it's geared toward helping you select the parts for a home-built PC. If you're new to building your own systems and want a little extra help, our tutorial on how to build your own PC is a great place to start and a helpful complement to this guide.

Before tackling our recommended systems, we should explain some of the rules and guidelines we used to select components. The guiding philosophy behind our choices was to seek the best bang for the buck. That means we avoided recommending super-cheap parts that are barely capable of performing their jobs, just as we generally avoided breathtakingly expensive products that carry a hefty price premium for features or performance you probably don't need. Instead, we looked to that mythical "sweet spot" where price and performance meet up in a pleasant, harmonic convergence. We also sought balance within each system configuration, choosing components that make sense together, so that a fast processor won't be bottlenecked by a skimpy graphics card or too little system memory, for instance. The end result, we hope, is a series of balanced systems that offer decent performance as configured and provide ample room for future expandability.

We confined our selections to components that are currently available online. Paper launches and preorders don't count, for obvious reasons. We also tried to stick to $500, $800 and $1200 budgets for our three cheapest desktop systems. Those budgets are loose guidelines rather than hard limits, to allow us some wiggle room for deals that may stretch the budget a little but are too good to resist.

We've continued our tradition of basing the guide's component prices on listings at Newegg. We've found that sourcing prices from one large reseller allows us to maintain a more realistic sense of street prices than price search engine listings, which are sometimes artificially low. In the few cases where Newegg doesn't have an item in stock, we'll fall back to our trusty price search engine rather than limit our options.

Finally, price wasn't the top factor in our component choices. Our own experiences with individual components weighed heavily on our decisions, and we've provided links to our own reviews of many of the products we're recommending. We've also tried to confine our selections to name-brand rather than generic products—and to manufacturers with solid reputations for reliability. Warranty coverage was an important consideration, as well.

computer centre

The Computer Centre was set up in 1967 as an attached office of the Department of Statistics, with three second-generation computer systems, to cater to the data processing needs not only of the Department of Statistics but of other Ministries and Departments of the Union government as well. From its very inception, the Centre has performed the pioneering task of building up data processing capabilities in many organisations of the Government of India and has played a vital role in imparting intensive training in systems analysis and design, software (including programming languages) and data processing, as well as in providing consultancy. Until recently, the Centre's major responsibility related to the preparation and processing of data collected through various socio-economic surveys, Economic Censuses, enterprise surveys, price data and the Annual Survey of Industries conducted by the NSSO and the CSO. Its role has since diversified. At present the Centre possesses a large number of PCs and a group of highly skilled and experienced EDP professionals.

Data Preparation

Data preparation is an integral part of data processing. Besides processing data made available to it on magnetic tapes, the Computer Centre has also prepared data by keying it in through data entry machines. For data entry from schedules/questionnaires, the Centre has been using PC-based electronic machines since 1994. The Centre is currently responsible for data preparation (i.e. keying in data through data entry machines) relating to:

o Consumer Price Index for Urban Non-Manual Employees, CPI(UNME) – monthly returns.

o National Sample Survey Socio-Economic Survey Listing schedules (0.1 and 0.2) from the 51st round onward.

Data Processing:

Over the years, the Computer Centre has undertaken computing jobs for processing voluminous data varying widely in structure and content. It shouldered the responsibility of processing and tabulating NSS data (and was partly involved in the data cleaning effort) from the 27th round (1972-73) through the 50th round (1993-94), and ASI data from 1974-75 up to ASI 1994-95. Tabulations based on past data are also taken up from time to time to meet specific requirements of the Planning Commission and other users. The entire software for processing and tabulating the data of the Economic Censuses (EC) of 1977, 1980, 1990 and 1998 was developed at the Centre and provided to the States for processing and tabulating State-level data. Besides giving technical guidance to the States, the Computer Centre carried out the data entry and processing of the EC-98 data of Meghalaya and Orissa. Generation of all-India tables based on State-level tabulations was also done at the Centre. The data of the CSO's various enterprise surveys are also processed at the Centre, which likewise produces the price index for urban non-manual employees, CPI(UNME), on a regular basis. The latest data processing project completed by the Centre was the Time Use Survey, conducted by the Central Statistical Organisation in close coordination with six participating states, namely Haryana, Gujarat, Orissa, Madhya Pradesh, Tamil Nadu and Meghalaya.

Data Preservation and Dissemination:

As per the "National policy on dissemination of statistical data", Computer Centre has preserved a large volume of data generated through various socio-economic surveys, Enterprise Surveys, Economic Censuses, Annual Survey of Industries and price data on CD-ROMs. These data are being disseminated regularly to a large number of national and international users. Technical guidance for the use of basic data and their processing is also provided to the users both within and outside the country on request.

Training Activity:

Training is one of the main activities of the Centre. It has been conducting EDP courses for various State and Central government departments and international agencies, and over the years it has trained a large number of officers in electronic data processing. Between 1983 and 1991, it conducted six "Programmer"-level courses and two "Training of Trainers"-level courses in EDP for UN-sponsored candidates from ESCAP-region countries under the United Nations Household Survey Capability Programme. EDP training for ISS probationers was also conducted on a continuous basis up to 1995. Currently the Centre conducts Information Technology (IT) courses, courses on software packages for middle-level ISS officers, and in-house training courses. Training on specific modules covering Oracle & Developer 2000, C++, Visual Basic, Visual FoxPro, and the Internet, web design & networking has also been organised regularly.

Web-site:

The web-site of the Ministry of Statistics and Programme Implementation has been indigenously designed and is maintained by the Computer Centre. Press releases are uploaded on the same day; other material is uploaded in the shortest possible time after necessary editing and preparation. To provide better access to users, the Centre is planning to make it a dynamic web-site, and it is also planned to make the entire web-site bilingual as per the PMO's guidelines. The Computer Centre has completed a project on designing exclusive web pages for NSSO (FOD), including computerisation of Annual Survey of Industries activities.

The web-site has recently been redesigned. Besides making it more attractive, a hit counter, a search engine icon, a facility for user suggestions, and a registration form for users downloading reports have been added. The objective is to make the web-site, to the extent possible, menu-driven. The address of the web-site is: http://mospi.gov.in

Data Warehouse on Official Statistics:

In the light of advances in the field of Information Technology, statistical data can now play a vital role in planning for development, including the industrialisation of any country. Key statistical data pertaining to all major sectors of the economy must therefore be available over the Internet, simultaneously, to all users associated with planning and development, as well as to researchers. To this end, the Centre has taken up a project for the creation and maintenance of a National Data Warehouse of Official Statistics from the data generated through sample surveys, censuses, enquiries, etc. Under this project, the Centre will preserve on electronic media the data generated by various Central Ministries, Departments and Public Sector Undertakings, organise the data in the form of databases, create the data warehouse, and provide remote access facilities to end-users through a network. This is, however, a new and evolving concept, and it will take some time before the warehouse becomes fully functional and usable. The Computer Centre has made a beginning by starting the preparation of databases, an essential step in the creation of a data warehouse. A Direction Committee under the chairmanship of the DG & CEO, NSSO, with members from IIT Delhi, IASRI Delhi, RBI and NIC, besides senior officers of the Ministry, has been constituted to advise the Computer Centre on all aspects of setting up the data warehouse, including determination of the appropriate hardware configuration, selection of software and a consultant, and training of officers of the Computer Centre. On the advice of the Direction Committee, the Computer Centre prepared an approach paper giving the broad outlines of the project. Further, the Australian Bureau of Statistics and Statistics Canada, which have already set up data warehouses in their countries, have been approached to share their experience and the architecture and approach followed in creating them.

While considering the approach paper, the Direction Committee, at its meeting held in November 2002, decided that there was an imperative need to appoint a consultant to assist the Computer Centre in this project. Besides system analysis, system design, selection of software and hardware, and development of application software, the consultant would carry out a pilot project jointly with officers of the Computer Centre on live databases of at least two sectors for the development of the data warehouse. The consultant would also suggest and organise appropriate training for the officers of the Computer Centre. The Direction Committee has recommended that Cognos data warehouse tools be used for the project. It is expected that the pilot project on the creation of a data warehouse on Consumer Expenditure and Employment & Unemployment will be completed by the end of the current financial year.


2012

2012 is not a disaster. That's one thing critics, for the most part, agree on -- to varying degrees. Roger Ebert in the Chicago Sun-Times even goes so far as to call it "the mother of all disaster movies," largely because the movie doesn't merely show a few recognizable landmarks being destroyed, but the entire Earth. "You think you've seen end-of-the-world movies?" he remarks. "This one ends the world, stomps on it, grinds it up and spits it out." His conclusion: "The movie gives you your money's worth. Is it a masterpiece? No. Is it one of the year's best? No. Does Emmerich hammer it together with his elbows from parts obtained from the Used Disaster Movie Store? Yes. But is it about as good as a movie in this genre can be? Yes."

Many reviewers note that it's a useless enterprise to try to critique the screenplay, which is based on the premise that the ancient Mayans forecast the end of the world on December 21, 2012, the final day of their calendar. (They apparently did not forecast the end of their own civilization, which occurred hundreds of years earlier.) That hasn't stopped others from zeroing in on the plot. Manohla Dargis in the New York Times comments, "Despite the frenetic action scenes, the movie sags, done in by multiple story lines that undercut one another." Claudia Puig in USA Today sums up: "The movie is an undeniable visual spectacle, but just as unequivocally a cheesy, ridiculous story." Lou Lumenick in the New York Post won't even grant that it's cheesy, calling it instead "pure Velveeta" -- but, ah, the spectacle. "About the only thing that's missing from 2012 (except sanity)," he writes, "is 3-D, IMAX and Sensurround. For those, I would gladly pay $20 a ticket." Noting that the movie reportedly cost $260 million to make, Elizabeth Weitzman writes in the New York Daily News: "All that money can buy some jaw-dropping special effects, but not, it seems, a script worth a dime."

Still, Tom Maurstad in the Dallas Morning News thinks it was probably a good idea to present a threadbare story. "If the viewer were ever invited to think or feel about what's happening on-screen, the movie's wow-whoa-ain't-it-cool momentum would collapse in a heap of horrific preposterousness," he writes. And Mick LaSalle in the San Francisco Chronicle gives it a rave review, while admitting, "It's hard to do justice to this ridiculous, wonderful movie." LaSalle makes the point: "People talk about 'formula' almost always as a pejorative, but formulas get to be formulas because they work, and there's something to be said for a formula picture done almost to perfection." On the other hand, Joe Morgenstern in the Wall Street Journal hasn't a kind word to say about either the story or the effects, tagging the movie "destructo drek."

Database Design and Modeling Fundamentals

Database design and the creation of an entity relationship diagram (also known as an "ERD" or data model) is an important yet sometimes overlooked part of the application development lifecycle. An accurate and up-to-date data model can serve as an important reference tool for DBAs, developers, and other members of a JAD (joint application development) team. The process of creating a data model helps the team uncover additional questions to ask of end users. Effective database design also allows the team to develop applications that perform well from the beginning. By building quality into the project, the team reduces the overall time it takes to complete the project, which in turn reduces project development costs. The central theme behind database design is to "measure twice, cut once".

Effective database designers will keep in mind the principles of normalization while they design a database. Normalization is a database design approach that seeks the following four objectives (a short sketch after this list illustrates the first of these):
  1. minimization of data redundancy,
  2. minimization of data restructuring,
  3. minimization of I/O by reduction of transaction sizes, and
  4. enforcement of referential integrity.
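
As a concrete illustration of the first objective, here is a minimal, hypothetical sketch in Python using the standard-library sqlite3 module. The table and column names are invented for illustration: a denormalized table repeats the store address on every employee row, while the normalized form stores the address once and references it by key.

    # Minimal sketch of objective 1: minimizing data redundancy.
    # Uses Python's standard-library sqlite3 module; all names here
    # are hypothetical examples, not a prescribed schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Denormalized: the store address is repeated on every employee row,
    # so a change of address must be made in many places.
    conn.execute("""
        CREATE TABLE EMPLOYEE_FLAT (
            employee_id   INTEGER PRIMARY KEY,
            employee_name TEXT NOT NULL,
            store_name    TEXT NOT NULL,
            store_address TEXT NOT NULL  -- redundant copy per employee
        )""")

    # Normalized: the address lives in exactly one place, and employees
    # refer to it by key.
    conn.execute("""
        CREATE TABLE STORE_LOCATION (
            store_id      INTEGER PRIMARY KEY,
            store_name    TEXT NOT NULL,
            store_address TEXT NOT NULL
        )""")
    conn.execute("""
        CREATE TABLE EMPLOYEE (
            employee_id   INTEGER PRIMARY KEY,
            employee_name TEXT NOT NULL,
            store_id      INTEGER NOT NULL
                          REFERENCES STORE_LOCATION(store_id)
        )""")
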
The following concepts and techniques are important to keep in mind when designing an effective database:
  1. An entity is a logical collection of things that are relevant to your database. The physical counterpart of an entity is a database table. Name your entities in singular form and in ALL CAPS. For example, an entity that contains data about your company's employees would be named EMPLOYEE.

  2. An attribute is a descriptive or quantitative characteristic of an entity. The physical counterpart of an attribute is a database column (or field). Name your attributes in singular form with either Initial Capital Letters or in all lower case. For example, some attribute names for your EMPLOYEE entity might be: EmployeeId (or employee_id) and BirthDate (or birthdate).

  3. A primary key is an attribute (or combination of attributes) that uniquely identifies each instance of an entity. A primary key cannot be null, and the value assigned to a primary key should not change over time. A primary key also needs to be efficient. For example, a primary key that is associated with an INTEGER datatype will be more efficient than one that is associated with a CHAR datatype. Primary keys should also be non-intelligent; that is, their values should be assigned arbitrarily, without any hidden meaning. Sometimes none of the attributes of an entity are sufficient to meet the criteria of an effective primary key. In this case the database designer is best served by creating an "artificial" primary key.

  4. A relationship is a logical link between two entities. A relationship represents a business rule and can be expressed as a verb phrase. Most relationships between entities are of the "one-to-many" type in which one instance of the parent entity relates to many instances of the child entity. For example, the relationship between EMPLOYEE and STORE_LOCATION would be represented as: one STORE_LOCATION (parent entity) employs many EMPLOYEEs (child entity).

  5. The second type of relationship is the "many-to-many" relationship, in which many instances of one entity relate to many instances of the other entity. "Many-to-many" relationships need to be resolved in order to avoid data redundancy. They may be resolved by creating an intermediate entity known as a cross-reference (or XREF) entity. The XREF entity is made up of the primary keys of both original entities, and each original entity becomes a parent of the XREF entity. Thus, the "many-to-many" relationship is resolved into two "one-to-many" relationships. For example, the "many-to-many" relationship of (many) EMPLOYEEs are assigned (many) TASKs can be resolved by creating a new entity named EMPLOYEE_TASK, giving two "one-to-many" relationships: EMPLOYEE (parent entity) is assigned EMPLOYEE_TASK (child entity), and TASK (parent entity) is assigned to EMPLOYEE_TASK (child entity). (The schema sketch after this list shows this resolution in code.)

  6. A "foreign key" exists when the primary key of a parent entity exists in a child entity. A foreign key requires that values must be present in the parent entity before like values may be inserted in the child entity. The concept of maintaining foreign keys is known as "referential integrity".

  7. Relationships between two entities may be classified as being either "identifying" or "non-identifying". Identifying relationships exist when the primary key of the parent entity is included in the primary key of the child entity. On the other hand, a non-identifying relationship exists when the primary key of the parent entity is included in the child entity but not as part of the child entity's primary key. In addition, non-identifying relationships may be further classified as being either "mandatory" or "non-mandatory". A mandatory non-identifying relationship exists when the value in the child table cannot be null. On the other hand, a non-mandatory non-identifying relationship exists when the value in the child table can be null.

  8. Cardinality helps us further understand the nature of the relationship between the child entity and the parent entity. The cardinality of a relationship may be determined by asking the following question: "How many instances of the child entity relate to each instance of the parent entity?". There are four types of cardinality: (1.) One to zero or more (common cardinality), (2.) One to one or more (P cardinality), (3.) One to zero or one (Z cardinality), and (4.) One to exactly N (N cardinality).
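
To tie concepts 1 through 7 together, here is a minimal, hypothetical schema sketch in Python using the standard-library sqlite3 module. None of the tables or values below come from a particular system; they simply mirror the EMPLOYEE, STORE_LOCATION, TASK and EMPLOYEE_TASK examples above.

    # A hypothetical schema tying together concepts 1-7: entities become
    # tables, attributes become columns, and EMPLOYEE_TASK is the XREF
    # entity that resolves the EMPLOYEE/TASK many-to-many relationship.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked

    # Parent entity: one STORE_LOCATION employs many EMPLOYEEs.
    conn.execute("""
        CREATE TABLE STORE_LOCATION (
            store_id INTEGER PRIMARY KEY,  -- non-intelligent, efficient key
            city     TEXT NOT NULL
        )""")

    # Non-identifying, mandatory relationship: store_id is a foreign key
    # but not part of EMPLOYEE's primary key, and NOT NULL means every
    # employee must belong to a store.
    conn.execute("""
        CREATE TABLE EMPLOYEE (
            employee_id INTEGER PRIMARY KEY,
            birth_date  TEXT,
            store_id    INTEGER NOT NULL
                        REFERENCES STORE_LOCATION(store_id)
        )""")

    conn.execute("""
        CREATE TABLE TASK (
            task_id     INTEGER PRIMARY KEY,
            description TEXT NOT NULL
        )""")

    # XREF entity: its primary key is built from both parents' keys, so
    # both relationships are identifying, and the many-to-many link is
    # resolved into two one-to-many links.
    conn.execute("""
        CREATE TABLE EMPLOYEE_TASK (
            employee_id INTEGER NOT NULL REFERENCES EMPLOYEE(employee_id),
            task_id     INTEGER NOT NULL REFERENCES TASK(task_id),
            PRIMARY KEY (employee_id, task_id)
        )""")

    # Referential integrity in action: a child row whose parent does not
    # exist is rejected by the database itself.
    conn.execute("INSERT INTO STORE_LOCATION VALUES (1, 'Springfield')")
    conn.execute("INSERT INTO EMPLOYEE VALUES (100, '1980-01-01', 1)")  # accepted
    try:
        conn.execute("INSERT INTO EMPLOYEE VALUES (101, NULL, 99)")  # no store 99
    except sqlite3.IntegrityError as err:
        print("rejected:", err)  # FOREIGN KEY constraint failed

Note that the INTEGER surrogate keys follow the efficiency and non-intelligence advice in point 3, and that SQLite, unlike most database servers, enforces foreign keys only when the PRAGMA shown above is issued.
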
In conclusion, effective database design can help the development team reduce overall development time and costs. Undertaking the process of database design and creating a data model helps the team better understand the user's requirements and thus enables them to build a system that better reflects those requirements and business rules. The act of performing database design is platform-independent, so persons who use database systems other than SQL Server should also be able to benefit from these concepts.

all about IT

Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.

Today, the term information technology has ballooned to encompass many aspects of computing and technology, and it has become very recognizable. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.

When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. In general, when speaking of information technology (IT) as a whole, the combined use of computers and information is implied.

In recent years, ABET and the ACM have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study, separate from both Computer Science and Information Systems. SIGITE is the ACM working group for defining these standards.