Thursday, March 31, 2011

Cubicle wars: Best and worst office setups for tech workers



Consider the modern office layout: open floor plan, lots of common space flooded with natural light, clusters of "pods" with low partitions (or none), all designed to encourage teamwork, boost productivity and -- management hopes -- improve the bottom line.

That type of office layout looks great on the company's Web site, and most likely the creative team loves it, but does IT? After all, many high-tech employees prefer to work in solitude, or at least in an environment quiet enough to foster intense concentration for significant chunks of time. Are these trendy open office layouts torture to the techie brain?

Web designers and developers, project managers, system architects, even some software developers are embracing office layouts that encourage interaction.

On the other hand, asking programmers or network administrators to do their jobs in an open space where noise, distractions and interruptions abound can be akin, for some of them at least, to departmental decimation.

We talked with IT managers at a range of companies, from giants like Google to small consultancies, to get a sense of which office layouts are better for which types of high-tech workers -- and which, emphatically, are not. Here's what we found about IT's likes and dislikes and why office layout is not a decision to make lightly.
Open plan vs. private office: the eternal debate

The IT profession attracts people who multitask in the extreme, declares a tech manager who oversees a staff of 15 at a U.S.-based grocery chain. These types of workers need some privacy to stay on task without interruptions sending them off in even more new directions, she says.
"Most people I [manage] are high-functioning multitaskers who can't stand to sit still; they're always doing something. They want offices so they're not disturbing others," says the manager, who asked not to be named.
She has twelve years of experience in management, and she says that every IT worker she has managed has jumped at the chance to move from a cubicle to an office when given the opportunity. And yet these workers still want their offices to be located close together, so they can easily bounce ideas off people who understand what they're talking about. "When they have a problem, they can quickly explain it to someone to get an answer," she says. "But they also like to be able to withdraw." 
 Walter says he believes Ally strikes a good balance between open and private. It offers IT workers a large open area filled with aisles of cubes divided by low partitions, with a variety of conference rooms -- ranging from ones large enough to fit about 30 people to small ones that accommodate just three or four -- scattered around the edges.

What's really best for IT?

The current trend in office design is to offer open space that fosters collaboration while still offering some workstation definition -- that is, some kind of physical boundary between one worker and the next.
For example, opening out traditional L-shaped cubes, where the two legs of the L form a 90-degree angle, to a configuration where the two sides form a 120-degree angle increases workers' space while still giving them some boundaries.

Another employee says the best setting is a cube with high walls, located in the same area as other people who are working on the same project, regardless of job function. That way he can have the privacy to focus on his work but also be able to stand up, take a few steps and ask questions of his co-workers when needed.


Thursday, March 24, 2011

Six Mistakes Companies Are Making Today

Is your company making the same mistakes?
Take the time to find out.
Mistake 1: Taking Existing Customers for Granted
When demand slows, the last thing you want to do is take your customer base for granted by assuming existing customers will remain loyal through tough economic times. Instead, customers may be faced with their own financial difficulties or could be lured away by incentives from the competition. Staying in tune with the needs of your customers and reacting appropriately can help shore up loyalty and maintain sales.
You need a simple and effective way to identify top customers based on profitability, size, or potential. But if customer and sales data is spread across multiple software applications, it can be difficult to access the information and build the overall picture needed for valuable insight into customer behaviors.
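To make that point concrete, here's a minimal sketch of consolidating customer records from two separate systems and ranking them by profit. Everything here -- the system names, field names and figures -- is invented purely for illustration.

```python
# Hypothetical extracts from two separate systems -- a CRM and a
# billing application. IDs, names and figures are all invented.
crm = {
    1: {"name": "Acme",  "segment": "retail"},
    2: {"name": "Birch", "segment": "wholesale"},
    3: {"name": "Cedar", "segment": "retail"},
}
billing = [  # one record per invoice
    {"customer_id": 1, "revenue": 1200.0, "cost": 700.0},
    {"customer_id": 1, "revenue":  800.0, "cost": 500.0},
    {"customer_id": 2, "revenue": 5000.0, "cost": 3600.0},
    {"customer_id": 3, "revenue":  300.0, "cost": 280.0},
]

# Consolidate: total profit per customer across all invoices.
profit = {}
for row in billing:
    cid = row["customer_id"]
    profit[cid] = profit.get(cid, 0.0) + row["revenue"] - row["cost"]

# Rank customer IDs by profitability, most profitable first.
top = sorted(crm, key=lambda cid: profit.get(cid, 0.0), reverse=True)
for cid in top:
    print(crm[cid]["name"], profit.get(cid, 0.0))
```

The point isn't the ten lines of Python -- it's that once the data from both systems is joined on a common customer ID, ranking customers becomes trivial; the hard part in practice is getting the extracts in the first place.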
Mistake 2: Failing to Capitalize on Market Opportunities
Even with limited investment funds, you still have to make fact-based decisions about short- and long-term opportunities to grow top-line revenues – whether you’re trying to expand into new markets or extend a successful product line. Many great brands were launched during recessionary times and were still able to capitalize on unique market opportunities. To invest in and seize market opportunities with confidence, you need a solution that helps you see trends and variances, analyze scenarios, and select the right combination of initiatives to maximize your returns.
Mistake 3: Allowing Operational Inefficiencies to Persist
To keep the cost of delivering goods and services in line, you must continually find ways to reduce waste and eliminate inefficiencies. If operational wastefulness persists, you can lose control of your cost structures – and that puts pressure on your gross margins. Economic woes and restricted cash flows are forcing companies to analyze cost structures by delving deeper into the information already at their fingertips.
Mistake 4: Letting Problems Go Undiagnosed and Uncorrected
In today’s business environment, an organization needs to address all outstanding issues – but you must first identify and prioritize existing problems, then focus your time and energy on the most crucial ones. No organization wants to wait until a product is drastically behind schedule or a department significantly over budget before taking action. However, if you manually track project or program status, you risk not only wasting time and money on an ineffective approach but also delaying your ability to identify and then correct problems.
Mistake 5: Driving the Wrong Behavior in the Organization
If your corporate goals aren’t clearly defined, communicated, and measured, you’re missing out on an opportunity to encourage beneficial behaviors. You may improve performance in one department or division at the expense of overall company performance. Gartner suggests that organizations “show how performance management efforts will benefit the enterprise if metrics and reporting align to corporate goals.”
You need a solution that supports greater alignment, accountability, and performance at both the individual and the corporate levels. But companies often find their strategic initiatives disconnected from daily operational goals, leaving employees confused about individual priorities. By clearly assigning accountability for goals and timelines, you can communicate explicit performance expectations throughout the company and develop incentives that drive needed cultural and behavioral changes.

Mistake 6: Failing to Offer Transparency for Stakeholders
The global economic crisis is leading business stakeholders and governments to demand greater transparency into company finances, operations, decisions, and core performance metrics. However, many organizations find that overly complex reporting hampers their ability to demonstrate compliance or fiscal health. According to Gartner analyst Bill Hostmann, “most organizations find they do not have the information, processes, and tools needed by their managers to make informed, responsive decisions. Too many enterprises underinvest in their information infrastructure.”

Saturday, March 19, 2011

1)IBM Is Said to Pay $10 Million to Settle SEC Foreign-Bribery Allegations.
2)Rising costs in Bangalore, the center of India’s software industry, are prompting some in the U.S. to look more favorably on Hyderabad and other smaller tech hubs there.
3)Microsoft announced today that the latest version of Internet Explorer, IE9, reached 2.3 million downloads in 24 hours.
4)Facebook blamed for eating disorders among girls
5)A former Goldman Sachs Group Inc computer programmer was sentenced to eight years in prison for stealing secret code used in the Wall Street bank's valuable high-frequency trading system.
6)Google employees' '20% time' goes to crafting Digital Age tools for handling the Japan crisis.
7)Japan's next nightmare: serious health problems if too much radiation is released into the atmosphere.
8)The United Kingdom, home to some of the world's most reputed educational institutions, is also looking to send its students to study in India in the areas of science, technology and software development.
9)IBM said it has bagged a 10-year outsourcing deal from Caparo India to provide enterprise resource planning and data centre infrastructure services.
10)HCL Infosystems has bagged an order from the IAF to deploy Wideband CDMA-based portable wireless network at a cost of over Rs 300 cr.
11)Four days into Japan's crisis, India's top tech firms have started bringing back employees and moving work to their centres outside Japan.

Tuesday, March 15, 2011

Internet Explorer 9 is out today, but can it contain Chrome?

Microsoft today released the final version of its Internet Explorer 9 web browser, which has been available in Beta form for the past six months (accruing more than 40 million downloads in that time). Coming almost exactly two years after its predecessor IE8, IE9, now dubbed "Windows Internet Explorer 9", is touted as being the only fully hardware-accelerated HTML5 browser, promising to offer a "faster, richer and more immersive web experience".

Microsoft is hoping the new features will be enough to stem the slow but steady stream of browser users defecting to Google's Chrome, which has rocketed to a strong third position in the market (behind IE and Firefox) in just two and a half years (and which received a comprehensive update itself just one week ago).
Microsoft is presenting IE9 as the most standards-compliant browser it has ever released, with a strong focus on open HTML5 compatibility from the outset. It's also designed to offer a minimalist UI, so as not to get in the way of the user experience. As Microsoft puts it, "ideally, browsers melt into the background and allow websites to come forward and shine".

Microsoft is also emphasising with its new release the tight integration IE9 shares with the Windows OS - and your PC hardware generally - in order to bolster the browsing experience, claiming IE9 browsing becomes "as fast and responsive as native applications installed on your PC". And, in a sign of the times, IE9 does not offer support for Windows XP. It will be a significant let-down for many users, but yep: only Vista and Windows 7 (and Server 2008) users need apply. Purportedly this is due to the graphics hardware acceleration features of IE9, which are optimised for newer operating systems.


Predicting the Impact of Large Magnitude Earthquakes

Situation
It may be scientifically impossible to predict the specific day and time of an earthquake,
but researchers at the National Autonomous University of Mexico (UNAM) aren’t taking
no for an answer. They’re just asking a different question.
Instead of looking for “when,” the UNAM researchers are asking “what if.” “We’re not
predicting that an earthquake will actually happen,” says lead researcher Mario Chavez.
“We’re posing ‘what if’ type scenarios such as: if an earthquake of a given magnitude
does hit a specific area, how much and how fast will the earth’s surface move, and what
is the probable impact of the earthquake given the region’s existing or projected
infrastructure.”
Using a 3D seismic wave propagation code, the researchers have studied major historic
earthquakes, modeling how seismic waves move through the earth’s crust. As well
as accurately modeling past events, this simulation technology will enable scientists
to study ground motions from hypothetical earthquakes and identify where ground-shaking
shocks would be centered.
“Our research means that governments, developers and planners could soon have
access to vital earthquake ground motion data that will enable them to assess the impact
of large or extreme magnitude earthquakes in their own region,” says Dr. Chavez. “This
kind of information could play a major role when working on risk assessment for a
facility site or when designing homes, hospitals, schools.”
Challenge
While realistic 3D modeling of the propagation of large subduction earthquakes has
vast potential, it also poses a numerical and computational challenge, particularly
because it requires enormous amounts of memory and storage as well as an intensive
use of computing resources. At UNAM, the team had access only to a small cluster
that limited them to runs on tens or hundreds of processors for a few hours at a
time, producing only coarse simulations. For their code to be effective, they
needed finer resolutions obtainable only with more powerful parallel computing.

Solution
Through a collaboration made possible through the Scientific Computing
Advanced Training project (SCAT), a European Commission-funded project
bringing together researchers from six countries, Chavez connected with
computational scientists at the Science & Technology Facilities Council’s (STFC)
Daresbury Laboratory. With access to the Cray XT4 supercomputer HECToR,
the UK’s largest, fastest and most powerful academic supercomputer,
the Daresbury and UNAM teams worked together to optimize the code
for high performance computing and scale it up to more than 8,000
simultaneous processes. “For this project we have made use of the
highest levels of performance on parallel machines, allowing Chavez to
perform one of the few high resolution simulations to an accuracy and
magnitude that has not been done before for this kind of research,”
says Mike Ashworth, associate director of the computational science
and engineering department at STFC Daresbury Laboratory.
Using HPC to model seismic waves

Capturing earthquakes in high-resolution 3D
The research team led by Chavez from UNAM’s Institute of Engineering
set out to study the propagation of seismic waves through the earth’s
crust during major earthquakes, including the devastating magnitude-8
events in Mexico City in 1985 and Sichuan, China in 2008. Their
main objective was to produce 3D models of the low-frequency wave
propagation of the particular earthquake and compare the synthetic
seismograms with actual observations.
To do this they applied a 3D parallel finite difference code to simulate synthetic
seismograms. The code is highly suitable for parallel execution on a distributed
memory parallel computer like HECToR that uses explicit message passing
parallelization. In fact, the Cray XT™ systems are designed to optimize MPI
message passing; the SeaStar2+™ chip combines communications processing
and high-speed routing on a single device.
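To give a flavour of what a code like this does, here is a drastically simplified sketch of an explicit finite-difference wave update in one dimension. It is not UNAM's code -- the real solver handles the full 3D elastic problem -- and the grid size, wave speed and step sizes below are arbitrary illustrative choices.

```python
import numpy as np

def wave_step(u_prev, u_curr, c, dx, dt):
    """Advance the 1D scalar wave equation u_tt = c^2 * u_xx by one
    explicit time step, using central differences in space and time.
    (A toy stand-in for the 3D elastic solver described above.)"""
    r2 = (c * dt / dx) ** 2           # squared Courant number; must be <= 1 for stability
    u_next = np.empty_like(u_curr)
    # Interior points: three-point Laplacian stencil.
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
    u_next[0] = u_next[-1] = 0.0      # fixed (reflecting) boundaries
    return u_next

# Propagate an initial Gaussian pulse for 50 steps.
x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-((x - 0.5) ** 2) / 0.001)   # displacement at the two starting time levels
u1 = u0.copy()                            # (i.e. zero initial velocity)
for _ in range(50):
    u0, u1 = u1, wave_step(u0, u1, c=1.0, dx=0.01, dt=0.005)
```

Each update touches only a point and its immediate neighbours, which is exactly why the method splits so naturally across thousands of processors: each one can own a chunk of the grid and only needs to talk to the processors owning the adjacent chunks.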
The cluster system, on the other hand, didn’t have the scalability to maximize
the code. “On a cluster computer you can only get a coarse simulation,” says
Dr. Ashworth. “We took it onto the HECToR Cray XT4 system and scaled it up to
thousands of processors. We got very fine resolutions of the Sichuan earthquake.”
The high resolutions only obtainable on an extremely scalable system like the Cray
XT machine give a much finer, detailed view. “The results are much more realistic and
plausible and do a much better job of convincing people that the results are right,”
says Ashworth.
Exploiting the Cray XT4 system’s extreme processor counts did take a few intermediary
steps, however. The code required some optimizations in order to take full advantage of
the machine’s scalability, including vectorization, halo exchange, boundary conditions
and function inlining. But the work paid dividends. “The ultra high resolution we were
able to achieve enables simulations with unprecedented accuracy,” says Chavez.
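Halo exchange, one of the optimizations mentioned above, is the pattern that lets each process update its own slab of the grid independently: before every stencil step, neighbouring processes swap a layer of "ghost" cells. Here's a serial toy sketch with two subdomains standing in for two ranks; in the real code the copies would be MPI messages between neighbouring processes.

```python
import numpy as np

HALO = 1  # ghost-layer width needed by a three-point stencil

# A global 1D field of 10 cells, split across two "ranks".
# Each rank stores its 5 interior cells padded by ghost cells.
field = np.arange(10.0)
left = np.zeros(5 + 2 * HALO)
right = np.zeros(5 + 2 * HALO)
left[HALO:-HALO] = field[:5]    # interior of rank 0
right[HALO:-HALO] = field[5:]   # interior of rank 1

def exchange_halos(a, b):
    """Fill each subdomain's ghost cells with the neighbour's edge
    interior cells. In the real MPI code this would be a pair of
    send/receive messages between ranks; here it is plain array copies."""
    a[-HALO:] = b[HALO:2 * HALO]       # a's right ghost <- b's leftmost interior
    b[:HALO] = a[-2 * HALO:-HALO]      # b's left ghost  <- a's rightmost interior

exchange_halos(left, right)
# After the exchange, each rank can apply the stencil to all of its
# interior cells without touching the neighbour's memory again.
```

Because only the thin halo layers cross process boundaries, communication stays small relative to computation as the grid grows -- which is what makes scaling to thousands of processes pay off.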
Still, the resulting high-resolution simulations would be for naught if the results weren’t
satisfactory. The observed and synthetic velocity seismograms need to show reasonable
agreement both in time and frequency. In other words, for the code to be useful as a
damage-predicting tool for future earthquakes, the models and actual results of the
past seismic event being studied need to match. Ashworth reports success in modeling
both the Mexico City and Sichuan earthquakes: “The experiments we’ve done are in
hindcast. We’ve run the model and compared it to actual results and it has matched
up.”
The work of fine tuning the code, scaling up to even larger problems and achieving
finer resolutions continues. It’s a process that will be helped along by recent Cray XT6
upgrades to HECToR. The new Cray XT6 components are contained in 20 cabinets
and comprise a total of 464 compute blades each with two 12-core AMD Opteron
processors for a total of 44,544 cores.
“This research is leading stuff,” says Ashworth. “What this model allows us to do is to
show, if an earthquake happens in the region, where the potential damage would be
most likely to occur. We think that’s pretty useful in informing policymakers and
governments.”

Friday, March 11, 2011

Battle between the browsers

The battle of the browsers is heating up this week, with Microsoft readying the final version of Internet Explorer 9, Mozilla finishing Firefox 4 and Google releasing Chrome 10.

Switching browsers is not a decision to take lightly, and as Chrome has improved over the last couple of years, I've had a harder time considering other options.

The most noticeable change in Chrome 10 is the Options settings, and they'll be welcomed by tweakers and anyone who ever changes Chrome options. When you click the gear icon in the upper-right corner and select Options (Preferences on a Mac), the menu now opens in its own tab rather than in a relatively small window, as with previous releases, making it easier to find the options you want to change.

More important is that you can now search through Options, so you don't have to hunt around for the feature you want to change. For example, if you want to make changes to any settings related to downloads or passwords, type in one of those terms, and you'll be sent directly to those settings.

As you use Options, the Omnibox (Google's name for the address bar) displays a local URL for your location -- for example, chrome://settings/advanced for advanced settings and chrome://settings/browser for basic settings. In some instances, an individual Option feature or setting will have its own URL, such as chrome://settings/passwordManager for the Password Manager. You can add this to your bookmarks if it's a feature you frequently use, which I found very convenient.

Web Store and Web Apps
Since Google launched the Chrome Web Store in December, people have taken a liking to the whole concept. It's part discovery tool for new web services and part quick-launch tool from Chrome's home screen. The Web Store is a case where the whole is greater than the sum of its parts, and now it's something we can't do without. We'll see whether the same holds when the competition releases its own web apps platforms.

Pinned Tabs
Although IE9 lets you pin tabs to the taskbar in Windows 7, it doesn't let you pin tabs within the browser itself. Both Chrome and Firefox allow this, so your favorite websites can hide in a tiny corner at the top of the browser. The difference is control: Firefox doesn't give you a way to automatically open certain websites as pinned tabs, but with Chrome, any web app can be pinned whenever you launch it from the home screen.

Real Estate
Technically, Internet Explorer 9 affords the most screen real estate for web pages, but it only does so by squeezing tabs and the address/search bar onto the same line. This feels too claustrophobic, and moving tabs down to a separate line consumes a lot more space. Firefox 4 comes close to Chrome, but it's not quite equal.

Bookmarks Bar
I visit a lot of websites on a regular basis, such as news sources and blogging tools, so the bookmarks bar in Chrome is essential. Again, this is a feature that Firefox also offers, but it takes up a little more space than Chrome and lacks a shortcut to toggle the bar on and off (in Chrome, Shift-Ctrl-B).

Wednesday, March 9, 2011

Five Management Concepts That Really Work

Concept #1: Treat Business as a Series of Relationships
In many companies, executives envision the business world as a battlefield. By contrast, executives who treat business as a series of relationships tend to focus on managing the complexity of the interactions of different organizations and people all trying to accomplish different things. Rather than trying to get them to fight some imaginary enemy, these executives try to bring the individuals and organizations into alignment so that they’re working towards a common purpose.

In companies where this concept dominates, managers tend to be more cooperative. Because the emphasis is on relationships, people are more likely to connect one-on-one.
Concept #2: Envision the Corporation as a Community
Many executives tend to think of their company as a vast machine that they need to control. Managers who like the machine analogy tend to create rigid teams with rigid roles and rigid functions. Managers and workers alike become convinced that change is very difficult, similar to retooling a complicated machine.

Such managers tend to think of themselves as “controllers” whose job it is to make sure that people follow the rules of the “system.”
By contrast, when the corporation is envisioned as a community and employees really feel that they’re valued as individuals, they more easily dedicate themselves to the goals of the organization. They’re more likely to truly enjoy contributing to their own success, the success of their peers, and the success of the community at large.
Concept #3: Redefine Management as a Service Position
The natural result of seeing management as a control function is the creation of brittle organizations that can’t adapt to new conditions. Often this happens because multiple managers in multiple stovepiped groups set up conflicting power structures, each of which is trying to “control” what’s going on.
By contrast, when a corporate culture thinks of management primarily as a service position, you get coaches rather than dictators. Freed of the burden of attempting to “control things,” managers can more easily set a direction and obtain the resources that employees need to get the job done.
Concept #4: Treat Employees Like Adults
Sad to say, but many top managers think of their employees as resources, or as wayward children who are too immature and foolish to be assigned real authority and simply can’t be trusted. When managers think of their employees as adults and therefore as peers, they find it easier to shed the notion that their job is to order employees about.
Employees at all levels then take charge of their own destinies and stop acting like crybabies when things go wrong.
Concept #5: Use Technology to Create Flexibility
The introduction of technology into companies that adhere to the old beliefs (like business is a battlefield) has been, by and large, disastrous. What happens in this case is that technology is harnessed to strengthen management’s control and further infantilize employees. What’s worse, the more the technology becomes a tool of control, the more it’s used to automate processes, casting them in concrete. The end result is a brittle company that finds it MORE difficult to change and adapt. The sad thing about all of this is that technology, if applied correctly, can automate repetitive and boring work, thus freeing human beings to be creative, to build relationships and have meaningful conversations. However, that only happens if management gives up the idea of using technology to centralize control.

Tuesday, March 1, 2011

Gmail bug deletes e-mails for 150,000 users

The cloud has failed roughly 150,000 Gmail users, whose e-mails have been deleted and accounts disabled by a mysterious glitch.

Users on Google's help forum report that the Gmail bug responsible deletes everything, including e-mails, labels, folders, and settings. When affected users log on, they see a welcome message as if they've never used Gmail before. Other users simply found their accounts disabled while repairs are being done. According to Mashable, the bug affects less than 0.08 percent of users.

It's not yet clear whether Google will be able to restore the deleted e-mails. Google hasn't addressed the issue on the official Gmail blog, and the Google Apps Status Dashboard only acknowledges an ongoing "service disruption." Eventually, Google will have to explain how this happened and what will become of affected users' accounts.

For everyone else, the lost e-mails are a reminder that Web storage isn't completely immune to failure. If you rely on Gmail to safeguard e-mailed documents and important correspondence, consider backing up your account.