Thursday, May 31, 2012

And Then Sky Died

We all have Sky stories.  If you've been to downtown Athens more than twice, you have a Sky story. I have dozens.

With Sky's passing, I'm going to say that the toughest May in the past ten years is nearly in the books. That is all.

RIP, Sky.


Athens icon ‘Sky’ Hertwig dead at 60 - AJC

Five Key Factors Drive the Internet Growth Trajectory

Cisco issued the results of its annual Visual Networking Index (VNI) Forecast (2011-2016), the company's ongoing initiative to forecast and analyze Internet Protocol (IP) networking growth and trends worldwide. The VNI Forecast update projects the significant amount of IP traffic expected to travel over public and private networks -- including Internet, managed IP, and mobile data traffic generated by all users.

This year, Cisco has also developed a new complementary study -- the Cisco VNI Service Adoption Forecast, which includes global and regional residential, consumer mobile, and business services growth rates.

By 2016, annual global IP traffic is forecast to reach 1.3 zettabytes (a zettabyte is equal to a sextillion bytes, or a trillion gigabytes). The projected increase in global IP traffic between 2015 and 2016 alone is more than 330 exabytes -- almost equal to the total amount of global IP traffic generated in 2011 (369 exabytes).

This significant level of traffic growth is driven by five key factors:
  1. An increasing number of devices: The proliferation of tablets, mobile phones, and other smart devices as well as machine-to-machine (M2M) connections are driving up the demand for connectivity. By 2016, the forecast projects there will be nearly 18.9 billion network connections -- almost 2.5 connections for each person on earth -- compared with 10.3 billion in 2011.
  2. More Internet users: By 2016, there are expected to be 3.4 billion Internet users -- about 45 percent of the world's projected population according to United Nations estimates.
  3. Faster broadband speeds: The average fixed broadband speed is expected to increase nearly fourfold, from 9 megabits per second (Mbps) in 2011 to 34 Mbps in 2016.
  4. More video: By 2016, 1.2 million video minutes -- the equivalent of 833 days, or over two years -- will travel the Internet every second.
  5. Wi-Fi growth: By 2016, over half of the world's Internet traffic is expected to come from Wi-Fi connections.
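The headline figures hang together arithmetically, and the math is easy to check. A quick sanity-check sketch (the ~7.5 billion 2016 world-population figure is my assumption, back-derived from the connections-per-person claim; it is not stated in the release):

```python
# Sanity checks on the headline Cisco VNI figures.

# 2011 actual: 30.7 exabytes/month should match the 369 EB/year cited
annual_2011_eb = 30.7 * 12
print(round(annual_2011_eb))        # ~368 EB, in line with the 369 EB figure

# Video: 1.2 million video minutes per second, expressed in days
video_days = 1.2e6 / (60 * 24)      # minutes -> days
print(round(video_days))            # ~833 days, i.e. over two years

# Connections per person in 2016: 18.9 billion connections,
# assuming a world population of roughly 7.5 billion (my assumption)
per_person = 18.9e9 / 7.5e9
print(round(per_person, 1))         # ~2.5 connections per person
```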

The Cisco VNI Forecast Methodology

The annual Cisco VNI Forecast was developed to estimate global Internet Protocol traffic growth and trends. Widely used by service providers, regulators, and industry influencers alike, the Cisco VNI Forecast is based on in-depth analysis and modeling of traffic, usage and device data from independent analyst forecasts.

Cisco validates its forecast, inputs and methodology with actual traffic data provided voluntarily by global service providers and more than one million consumers worldwide. The following Cisco VNI Forecast resources and tools are available online:
  • The updated Cisco VNI Forecast Highlights Tool provides key forecast predictions in short sound bites that can be chosen on a global, regional or country level (these include device, traffic and network speed projections).
  • The Cisco VNI Forecast and Methodology, 2011 – 2016 White Paper provides the full detailed findings of the study.
  • The Cisco VNI Forecast widget provides customized views of the growth of various network traffic types around the globe (revised for this 2011 - 2016 forecast period).
  • The Cisco VNI Service Adoption Forecast White Paper provides a unique view into global and regional trends of next-generation residential, consumer mobile, and business end-user services and applications, underlying addressable markets and relevant devices and connections.
  • The Cisco VNI Service Adoption Forecast Highlights Tool provides primary global and regional takeaways on user and subscriber, device and connection, and service adoption penetration rates.
  • The New Cisco Data Meter application (beta version 1.0) for Android smartphones provides users with the following valuable network-related data: estimated and projected bandwidth consumption, individual app usage, Wi-Fi and cell connection speeds, and the location of nearby Wi-Fi networks.
    Historically, Cisco VNI projections have generally been viewed as conservative; however, the forecast has proven to be quite accurate throughout its six-year history.
    • In the initial 2007 VNI Forecast, Cisco projected an overall IP traffic volume of 28.4 exabytes per month by 2011. The actual 2011 volume of 30.7 exabytes per month came in about 8 percent higher than the projection made five years earlier.
    • In the 2008 VNI Forecast, Cisco predicted that in 2010 Internet video would surpass P2P in traffic volume. In 2010, Internet video surpassed P2P in traffic volume -- confirming the Cisco VNI Forecast.

    Monday, May 28, 2012

    Rationing Our Finite Resources… with Our Eyes Open

    Two articles published in the latest NEJM issue are devoted to “our finite resources” and the inevitability of rationing care. In the first, Dr. Howard Brody attempts to reframe the issue as one of “waste avoidance” and delves into the ethics of practicing waste avoidance, defined as withholding non-beneficial care. Of course, non-beneficial is not a cut-and-dried label, so Dr. Brody envisions that “[a]n ethical system for eliminating waste will include a robust appeals process”, where “[p]hysicians, as loyal patient advocates, must invoke the process when (according to their best clinical judgment) a particular patient would benefit from an intervention even if the average patient won't”. While the sentiment is commendable, it is not clear to whom exactly physicians should appeal “according to their best clinical judgment”, nor where the newly empowered and engaged patient should go. Or is all this empowerment and engagement talk just a cruel joke?

    The second article in NEJM, authored by M. Gregg Bloche, M.D., J.D. and titled “Beyond the “R Word”? Medicine's New Frugality”, argues that it is very unlikely we could actually cut the waste, which is estimated at 30% of costs, and that even if we do, costs will continue to rise and we are just postponing the inevitable. The inevitable, in the kind doctor’s opinion, is denying beneficial care. No beating around the bush any longer. Just say it and do it. Of course, the empowered patients are nowhere to be found in this article either, although it is full of advice for politicians on how to stop therapeutic advances in their tracks, because “we can't afford all the things that medicine can achieve” and “we must make painful choices between health care and other needs”. Sadly, Dr. Bloche J.D. did not elaborate on his “other needs”.

    But the most disturbing experience for me was reading the comments posted to both articles, mostly by physicians, and mostly supportive of the need to ration care to patients. One physician commenter in particular seemed to see rationing as “a logical corollary of our responsibilities as physicians” because physicians are citizens before they become physicians and they have duties and responsibilities to their fellow citizens which “are not superceded [sic] by the mythical primacy of the doctor-patient relationship”.
    Well, if we must ration our finite resources, let’s ration them with our eyes open, as Dr. Berwick wisely stated. So here is some eye opening material for your rationing pleasure….

    Random Non-profit Hospital Quotes

    “Millions of dollars have been spent transforming the hospital into a beautiful, soothing, healing environment. Artwork graces the walls and soft music fills the hallways. ……..
    “Hospital food” is also out-of-the-ordinary at Fauquier Health; The Bistro on the Hill provides food options for patients and their guests. Earning a reputation of its own in Warrenton, the Bistro serves everything from brick-oven pizza to gourmet pasta, from a fresh soup and salad bar to shrimp stir-fry. Bakers keep the dessert case filled with fresh cakes and pies, cookies and tarts. Community groups frequently congregate at The Bistro, and catering for off-site events is available.”

    “A six-story pavilion is under construction on the west side of Missouri Baptist Medical Center. The patient tower is part of a major building project that also will see construction of a new Clinical Learning Institute, new entry way, parking lot and nature trails for the campus at 3015 N. Ballas Road. The West Pavilion will add 216,000 square feet to the medical center and will house the medical center's new main entrance lobby; four floors of private patient rooms for surgical patients; four new operating rooms; and a new outpatient surgery center that will connect to the hospital's current operating room suites. "Medicine has changed a lot since 1965 (when Missouri Baptist relocated to its present location), when patient rooms were smaller and they still actually had patient wards," said Jesse Arevallo, facility and campus planning director. "The new standard of care requires a larger building footprint," he said. "This project, creating state-of-the art equipment and facilities, will tremendously benefit patients and the community." The private hospital rooms will allow the medical center to find a new use for its semi-private rooms in the main part of the hospital. "So we're not actually adding beds, but replacing," Arevallo said. The West Pavilion will cost $140 million. It is slated for completion by July 2013.”

    “Patient-centered amenities abound in the new Cancer Center facility
    As patients and families enter the lobby of the Cancer Center facility, they will step into a cozy, living room-style space with fireplace, sofas, chairs and decorative carpeting. A nearby café will offer indoor and outdoor seating and a health-focused menu with fruits and vegetables from local farmers. … A focal point of the building is a central, five-story, light-filled atrium, which looks down on spiraling artwork filled with inspirational quotes from Duke cancer patients and friends.”

    Highest Paid Non-Profit Hospitals CEOs in the Midwest

    1. Randall O’Donnell; Children’s Mercy Hospital and Clinics; Kansas City, Missouri: $6 million
    2. Javon Bea; Mercy Health System; Janesville, Wisconsin: $4.5 million
    3. James Skogsbergh; Advocate Health Care; Oak Brook, Illinois: $4 million
    4. Dean Harrison; Northwestern Memorial Hospital; Chicago, Illinois: $3.4 million
    5. Richard Pettingill; Allina Health System; Minneapolis, Minnesota: $3.3 million
    6. Joseph Swedish; Trinity Health; Novi, Michigan: $2.7 million
    7. Lowell Kruse; Heartland Regional Medical Center; St. Joseph, Missouri: $2.5 million
    8. Steven Lipstein; BJC Health System; St. Louis, Missouri: $2.2 million
    9. Kevin Schoeplein; OSF Healthcare System; Peoria, Illinois: $2.2 million
    10. Thomas Sieber; Genesis Healthcare System; Zanesville, Ohio: $2.1 million
    11. Paul Pawlak; Silver Cross Hospital; Joliet, Illinois: $2 million
    12. Toby Cosgrove; Cleveland Clinic; Cleveland, Ohio: $1.9 million
    13. William Petasnick; Froedtert Memorial Hospital; Milwaukee, Wisconsin: $1.9 million
    14. Fred Manchur; Kettering Medical Center; Dayton, Ohio: $1.9 million
    15. Patrick Magnon; Children’s Memorial Hospital; Chicago, Illinois: $1.8 million
    16. Kenneth Hanover; University Hospital; Cincinnati, Ohio: $1.8 million
    17. J. Luke McGuinness; Central Dupage Hospital; Winfield, Illinois: $1.8 million
    18. Daniel Evans Jr.; Clarian Health Partners; Indianapolis, Indiana: $1.8 million
    19. James Madera; University of Chicago Medical Center; Chicago, Illinois: $1.8 million
    20. James Anderson; Cincinnati Children’s Hospital Medical Center; Cincinnati, Ohio: $1.8 million

    2009 Insurance Companies Total CEO Compensation

    • Aetna, Ronald A. Williams: $18,058,162
    • Coventry, Allen Wise: $17,427,789 (took over from Dale Wolf)
    • WellPoint, Angela Braly: $13,108,198
    • United Health, Stephen Helmsley: $8,901,916
    • Cigna, David Cordoni: $6,593,921 (took over from CEO H. Edward Hanway)
    • Cigna, H. Edward Hanway: $18,800,000
    • Humana, Michael McCallister: $6,509,452
    • Health Net, Jay Gellert: $3,643,342 

    Following Dr. Brody’s reasoning, I can certainly see some golden opportunities here for ethical waste avoidance, with a proper appeals process in place, of course. And as painful as it may be, some of the beneficial ambiance provided by fireplaces, freshly baked brioche and well-fed executives will need to be cut as well, because Dr. Bloche is largely correct and we do have other needs, which could be addressed by having corporations, whose cash flows allow for millions of dollars in real estate investments and executive compensation, pay their taxes first.

    If you think that this is a liberal overreaction to a couple of academic papers, think again. In the January 2010 issue of NEJM, Dr. Brody suggested that each specialty should pick a “Top Five” list of expensive tests and treatments that should be discouraged. Today we have the ABIM Choosing Wisely campaign of exactly five “don’ts” for each specialty. What I find fascinating is that routine EKGs, for example, are discouraged twice in the ABIM lists, but places like the top ranked Mount Sinai hospital seem to think that having those routine EKGs and much more is a very good idea (if you have $6,000) and will “help you and your associates live longer, healthier lives through the highest measures of preventive care”. So is Mount Sinai with its “award-winning physicians” correct? Or is ABIM correct? Or do we advocate two standards of care based on the size of your wallet?

    As to Dr. Bloche J.D., his Huffington Post bio suggests that, amongst his many degrees and honors, he “was a health care advisor to President Obama’s 2008 campaign, as well as the presidential transition, and he spoke frequently for the campaign as a “surrogate.”” Is he speaking as a “surrogate” (whatever that may be) when suggesting that we must ration beneficial care (Mount Sinai executive customers excluded, of course)?

    I don’t have the answers to these questions, but the implications for non-policy makers are pretty clear. To all the ePatients, participatory patients, engaged and empowered patients out there, I would suggest that you devote as much time to policies still taking shape as you do to obtaining access to your electronic data, because having real-time visibility into how your care is being denied will provide very little comfort, even if it is supplied to you in a computable and transferable structured data format.
    To all practicing doctors: if you passively allow yourselves to be represented by learned articles like the ones cited here because you are too busy seeing patients, the day will come when, in the eyes of the public, your profession will be no different from that of an IRS auditor, financial and social rewards included. Yes, you will most likely be retired by then, but this is the Medicine you are going to experience in your old age, this is the Medicine you are bequeathing to your children and grandchildren, and this is not the Medicine that was entrusted to you by previous generations of physicians.

    Wednesday, May 23, 2012

    Is There a Baseball NIT?

    No other way around it.

I don't have any analysis other than to say we have had more hits and gotten more opportunities than we have allowed, yet we are at .500 on the season. Short of every team that's "supposed" to win its conference tourney doing so, plus some generous selection-day decisions, the Diamond Dawgs are done for the season.

    It will be an off season of serious introspection in the big office at Butts-Mehre. There are some family circumstances David Perno is dealing with, and has been for a bit. I am not as sold on it being the end for Perno as T. Kyle King is, but he has been right about some other decisions McGarity has made, so I wouldn't bet against him.


    Not happening-McGarity Confirms Perno Staying (Gentry Estes)

    What’s Up Doc? Medicare Carrots and Sticks

    The Patient Protection and Affordable Care Act (PPACA) of 2010 mandates that certain administrative simplifications should be made to reduce overhead costs of health care. Since administrative complexity obeys the conservation laws of physics, for every bit of complexity that is removed, a new chunk of bureaucratic complexity must be added to the system. With that in mind, CMS has created and is proposing to grow an array of financial incentives and penalties for health care providers. This collection of carrots and sticks is intended to be used as so many levers to control and fine tune the practice of medicine by encouraging adoption of health information technology, measuring processes and steering physicians to low cost treatment methods.

    Since confusion is abundant, and confusion leads to anger, fear and sometimes outright panic, which in turn causes folks to dump perfectly good private practices on the first hospital that knocks on their door, I thought it would be beneficial to clarify a few things and look at the situation from an objective mathematical perspective. Below are concise descriptions of the current CMS incentives and penalties programs, and a dollar amount evaluation of their possible effect on your bottom line.

    Electronic Prescribing

    A bit outdated in the EHR era, the eRx incentive program began in 2009 and is due to expire in its entirety by 2015, when no bonuses and no penalties will be assessed for this initiative.
    The Rules: Currently, 10 electronic prescriptions in the first half of the year will ward off next year's penalties, and 25 electronic prescriptions during the entire year will earn you next year's incentive. The prescriptions must be written for Medicare patients’ unique visits with associated E&M codes, and a “qualified” electronic prescribing system must be used. Any eRx module in a certified Complete EHR will do; if you use a standalone system, make sure it states that it is “qualified” for CMS incentives.
    The Numbers: The incentives are 1% this year and 0.5% in 2013. There are no incentives available after that. Incentives are calculated as a percentage of the total Medicare Physician Fee Schedule (MPFS) allowable charges for the calendar year. The penalties are -1%, -1.5%, -2% in 2012, 2013 and 2014 respectively. There are no penalties after 2014. Penalties are applied as an adjustment to ongoing MPFS payments during the penalty year. Note that you cannot receive eRx and Meaningful Use Medicare incentives in the same year. You can do so for Medicaid EHR incentives.

    Physician Quality Reporting System (PQRS)

    This is the successor of PQRI (the “I” was for initiative) and it started in 2010 in its current format with no proposed expiration date. It is important to keep in mind that the contents of your PQRS reports will be made public on the Physician Compare website maintained by CMS.
    The Rules: You will have to report on at least three clinical quality measures, or one measures group, for the reporting year. You may report your measures via claims, registries, EHR or a special group reporting tool. The reporting is limited to Medicare patients, and although the registry option offers a 6-month reporting period, most other methods require that you report on 30% to 80% of pertinent patients for the whole year. If you choose claims reporting, make sure you don’t let charges entered close to year-end linger, because CMS may not receive them in time to calculate your incentive. Both incentives and penalties are calculated and applied as they are for the eRx program.
    The Numbers: You could have gotten a 1% incentive in 2011, but from 2012 through 2014 the incentive is a constant 0.5%. There are no incentives authorized after 2014. Penalties begin in 2015 at -1.5% and settle at -2% from 2016 onward. There is no end date for the penalties. PQRS incentives are independent of eRx and Meaningful Use and may be combined with either one.

    Maintenance of Certification (MOC)

    The MOC program is only available for those who successfully report PQRS measures and is available only through the incentives phase of PQRS.
    The Rules: You need to participate in a Maintenance of Certification program and complete a practice assessment more frequently than is required to qualify for or maintain board certification. Make sure that your board is indeed qualified by CMS for this program, since not all are.
    The Numbers: This is a very simple program that will pay an additional 0.5% of MPFS to what you already receive for PQRS reporting. The program expires after 2014 and there are no penalties associated with it.

    The EHR Incentives Program (Meaningful Use)

    Saving the best for last, this is the big one and most advertised one. The Meaningful Use program started in 2011 and is projected to continue indefinitely. It has been likened to an escalator, where the requirements become more comprehensive and more complex every two or three years.
    The Rules: You must buy an ONC Certified Complete EHR (or a collection of certified modules) and meet a set of required measures every calendar year. The measures are adjusted every two (or three) years, from the current Stage 1 to future Stages 2, 3 and presumably others. There are two tracks for this program, one for Medicare and one for Medicaid participants. Meaningful Use is a very comprehensive set of measures reaching into every aspect of medical practice and is inclusive of both electronic prescribing and the reporting of clinical quality measures. The EHR incentives program and the electronic prescribing program are mutually exclusive under Medicare incentives.
    The Numbers: The program offers 5 years of decreasing incentives followed by incrementally increasing penalties for non-participation. The maximum incentive under the Medicare track is $44,000, plus 10% of that if you practice in a designated health professional shortage area, and $63,750 for the Medicaid track. You can join the Medicare track as late as 2014 (you will lose about half the incentive), and the Medicaid track can be started as late as 2016 with no loss of incentives. However, in 2015, penalties, in the form of adjustments to your Medicare allowed charges, will begin to apply to those not participating in either track. CMS is proposing to backdate the penalties, so they apply in 2015 to those who have not become Meaningful Users by October 1st of 2014, effectively moving up the compliance date mandated by legislation. The penalties start at -1% of MPFS in 2015 and increase by 1% every year until they reach -5% in 2019, continuing at the -5% level indefinitely.
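The note about joining late can be made concrete. The year-by-year Medicare payment schedules below are the HITECH/CMS figures as I recall them (verify against the official CMS tables before relying on them); summing each schedule reproduces the $44,000 maximum and the roughly-half figure for a 2014 start.

```python
# Medicare EHR-incentive payment schedules by first payment year
# (HITECH figures as I recall them; verify against CMS publications).
schedules = {
    2011: [18_000, 12_000, 8_000, 4_000, 2_000],
    2012: [18_000, 12_000, 8_000, 4_000, 2_000],
    2013: [15_000, 12_000, 8_000, 4_000],
    2014: [12_000, 8_000, 4_000],
}
for start_year, payments in schedules.items():
    print(start_year, sum(payments))
# 2011 and 2012 starters max out at $44,000; a 2014 start
# yields $24,000 -- roughly half, as noted above.
```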

    Bottom Line

    For illustration purposes, let’s say you see 10 Medicare patients every day, you work 5 days every week and 50 weeks every year in a health professional shortage area. Accounting for different E&M charges, you are looking at approximately $200,000 per year paid to you by Medicare, and clearly this is a best case scenario. Let’s further assume that the proposed reimbursement cuts and the proposed increases to primary care reimbursement balance each other out and your Medicare revenue stays flat in today’s dollars. How will the carrots and sticks affect your income?
    Scenario 1: You do all that is required and are rewarded with nothing but carrots between 2011 and 2020. In addition to your claims reimbursement, you will receive from CMS $57,400 over the current decade, which is $5,740 per year or $478 per month -- less than a 3% increase in your average Medicare reimbursement.
    Scenario 2: You ignore all CMS programs and do your own thing, and stick with your decision through 2020. You will of course not get any incentives and you will lose a total of $72,000 over this decade, or an average of $7,200 per year, which is $600 per month, or the equivalent of 3.6% of your Medicare revenue over 10 years.
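The scenario totals can be reproduced from the percentages above. The year-by-year breakdown below is my reconstruction of how the figures add up, assuming the illustrative $200,000 per year in Medicare allowed charges; it is a sketch, not an official CMS computation.

```python
# Reconstructing the two scenarios, assuming $200,000/year in
# Medicare allowed charges (the article's illustrative practice).
MEDICARE = 200_000

# Scenario 1: all carrots, 2011-2020
mu = 44_000 + 44_000 // 10                 # Meaningful Use + 10% HPSA bonus
pqrs = (0.01 + 3 * 0.005) * MEDICARE       # PQRS: 1% in 2011, 0.5% in 2012-2014
moc = 4 * 0.005 * MEDICARE                 # MOC add-on, 2011-2014
carrots = mu + pqrs + moc
print(round(carrots))                      # 57400 -> $5,740/year, $478/month

# Scenario 2: all sticks, through 2020
erx = (0.01 + 0.015 + 0.02) * MEDICARE               # eRx, 2012-2014
pqrs_pen = (0.015 + 5 * 0.02) * MEDICARE             # PQRS, 2015-2020
mu_pen = (0.01 + 0.02 + 0.03 + 0.04 + 0.05 + 0.05) * MEDICARE  # MU, 2015-2020
sticks = erx + pqrs_pen + mu_pen
print(round(sticks))                       # 72000 -> $7,200/year, $600/month
```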

    It is important to note that, while the incentives are temporary, the penalties are applied indefinitely, converging to 7% of MPFS, or $14,000 per year in our imaginary scenario. The sticks are larger than the carrots. These numbers do not take into account the cost of lost opportunities, such as performance bonuses and additional per-member-per-month payments that are becoming available from private and public payers for special endeavors such as medical homes and other quality improvements, which require the same infrastructure, exercised in the same fashion, as the Medicare incentive programs described above.

    But wait, there is more: The Value Based Payment Modifier (VBPM)

    If you are fortunate enough to practice in Iowa, Kansas, Missouri, or Nebraska, you are part of a preamble to a new Medicare program which proposes to add a modifier to your charges based on the ratio of cost to quality for services rendered to Medicare members. So far, physicians of all specialties in the selected pilot States have received Quality and Resource Use Reports (QRURs) outlining their performance and costs based on 2010 claim submissions. The program is still in its definition stage, with no clear numbers associated with the proposed modifier and no explanation of how such a modifier would be calculated, but it seems that this is going to be a far more significant stick or carrot than anything outlined above. The legislation mandates that this program go into effect in 2015, and by 2017 most physicians paid under the MPFS will see the VBPM applied to claims they submit to Medicare. [More on the VBPM in a future post…]

    These are the visible carrots and sticks. Now it’s your turn to do the math and make your decision. Note that Scenarios 1 and 2 above are the extremes. You can always jump on the wagon at a later point, with smaller incentives and/or smaller penalties. The wagon, though, is accelerating pretty quickly.

    It Must Have Been Tuesday in Hoover

    So the Diamond Dawgs must have lost their opening round SEC tournament game. Dawgs lose the opener to Vandy 4-1. I'd love to say there is one reason for it, but other than one stretch in the third inning, the Dawgs did all they had to defensively to win. Offensively, Vandy got the same number of hits, but scored three more runs. Pretty much a microcosm of the season: play well, but not great, and lose to a team they played evenly with.

    Now, the Dawgs find themselves in exactly the same situation as last season, having lost to Vandy in the Tuesday game and needing to play deep into the weekend to play next week. Going into the SEC tourney, Georgia needed to win a couple last year to not have to sweat too much on selection day.  I still think a win over South Carolina and a win over Florida will probably get the Dawgs in, just like last year.  Both of those programs were very good last year.  Both are very good this year. If it is Vandy and one of those other teams we beat to play until Saturday, I don't know.  Being one or two games over .500 and making the NCAA regionals requires pretty strong late season statement wins. Beating Vandy won't fill that requirement.

    By no means is it a lost cause. Unless we start getting hits when we need to, however, this season will end in Hoover instead of in an exciting regional final somewhere.


    Sunday, May 20, 2012

    At Least We Made the SEC Tournament

    While Georgia Baseball was a 'half game or so' out of fifth or sixth place, being the eighth seed isn't a good place to be. Basically, they'll have to beat Vanderbilt -- the Vanderbilt that nearly finished with a losing record, but is now seeded fifth thanks to a strong late-season run that began with two late-game wins over Georgia in late March -- then South Carolina, then Florida, just to have a shot at getting to an NCAA regional this year.

    I'll submit this without any comment other than to say we can still do it. We've played well enough against all of the teams in the conference that anything can go down. We've also played poorly enough that we could lose to Vandy and Auburn and be done by Wednesday afternoon.

    Either way, we still have to win at least three games in a row to feel comfortable about making the tournament.


    Keep Your Enemies Closer

    One thing lost in all the kerfuffle about the SEC/Big12 champions bowl is the increased likelihood the two conferences will play some cross-conference games to bolster in-season schedule strength. It only stands to reason that, with increased pressure to get teams into the playoff or whatever, both conferences have something to gain from adding a marquee match-up or three across the conferences.

    If money is going to be the primary driver in this deal, then getting A&M-Texas back, or a yearly rotation of Oklahoma/Okie State/West Virginia vs. LSU/Alabama/Georgia or whatnot, isn't out of the realm of possibility. While this counters Jim Delany's Pac1X alliance with the same move, it works because of where the SEC and Big12 currently sit in the college football firmament.

    I know there are folks panicking about the possibility of another tough football game on the schedule. I, for one, welcome it. As a fan, I get much more excited about playing top-tier teams than I do MAC or Southern Conference teams.


    Saturday, May 19, 2012

    Don't Be Ridiculous, Lemon

    Dude. A playoff? Really?

    I've uttered those words at least twenty times over the past four years. Yes, I know, the WHOLE world is demanding a college football playoff. Except those of us that don't want one.

    Blutarsky has been at the blogging forefront of the anti-playoff resistance. Like Blutarsky, I know the vast majority of the arguments for a playoff boil down to this: there is more money in it for the conferences. Hey, I'm a Jack Donaghy capitalist. Being such, I also believe that quality of product is more important than doing what merely brings in more money, as long as you are making enough money to know the name of the secret European country only rich people know about.

    Now, Mike Slive drops the big one and brings Chuck Neinas' (now Bob Bowlsby's) Big 12 from the realm of the dead to a likely de facto full-time seat at the semi-final table. The SEC Championship will soon be a national quarterfinal. The genius of Slive's plan is cutting the ACC and Big East and Notre Dame out of the game. If Roy Kramer is seen as the father of the Conference Championship Game, Slive may well be seen as the father of the Super Conference. Oh, I know Delany made a move to get some teams first (there is also a Schadenfreude thing about Nebraska in all of this I didn't think about), but this move will hasten the coming tide of new conference expansion.

    A couple of other thoughts have come to mind.  First, will the teams play in the Sugar Bowl? Fiesta Bowl? Rotate? Play at another place? Second, hasn't the SEC given the Big 12 champion an easier road into the BCS or whatever championship game, since the Big 12 doesn't currently have a championship game? Third, when will South Carolina propose that the conference representative in the SEC/Big12 bowl game be the team with the best record in September? Fourth, is Bob Bowlsby a vampire?

    As you can tell, there are a lot of details to be worked out.


    Thursday, May 17, 2012

    Top 10 BYOD and Virtualization Market Insights

    Like it or not, some enterprises have already entered a post-PC world -- where their business communication network must accommodate new user-driven choices. These include traditional applications, mobile apps, social apps and operating systems; various server architectures; and an array of mobile devices ranging from smartphones to tablets and other mobility tools. Are you experiencing this phenomenon? If not, you will soon. Moreover, this latest business technology trend has huge ramifications.

    Cisco’s Internet Business Solutions Group (IBSG) conducted extensive research and analysis to uncover key insights about BYOD (“bring your own device”) and desktop virtualization trends in U.S. enterprises. The Cisco IBSG Horizons BYOD and Virtualization study surveyed 600 enterprise IT leaders from 18 industries.

    The Top 10 Market Insights

    Insight 1: Mobility Is Pervasive
    • Seventy-eight percent of U.S. white-collar employees use a mobile device (e.g., laptop, smartphone, or tablet) for work purposes.
    • Respondents indicated that 65 percent of white-collar workers in their organizations require mobile connectivity to do their jobs.
    • Forty-four percent of knowledge workers telecommute at least once per week.
    • Cisco IBSG estimates that telecommuting once a week saves $2,500 per employee annually.

    Insight 2: Growth of Mobility Has Impacted IT Profoundly
    • By 2014, the average number of connected devices per knowledge worker will reach 3.3, up from an average of 2.8 in 2012 (18 percent increase).
    • On average, mobility initiatives will consume 20 percent of IT budgets in 2014, compared to 17 percent in 2012.

    Insight 3: How Much Longer Will Traditional Funding Models Exist?
    • Sixty-two percent of respondents’ organizations pay for both employees’ devices and their voice/data plans.
    • Seventy-five percent of respondents expect the share of employee-owned devices connected to company networks to increase “somewhat” to “significantly” over the next two years.
    • Forty-one percent of respondents indicated a majority of smartphones connecting to their company network are actually employee-owned.
    • According to Cisco IBSG, employees are willing to invest to improve their work experience. Cisco BYOD employees, for example, pay an average of $600 for their preferred devices.

    Insight 4: BYOD Is Here, and It’s Not a Bad Thing
    • Eighty-eight percent of surveyed IT leaders perceive growing technology “consumerization” in the enterprise.
    • Seventy-six percent consider consumerization “somewhat” or “extremely” positive for their companies.

    Insight 5: BYOD Delivers Several Benefits to the Enterprise
    • Among respondents, the top two perceived benefits of BYOD were improved employee productivity (more opportunities to collaborate) and greater job satisfaction.
    • The benefits of BYOD vary based on an employee’s role and work requirements. Cisco IBSG estimates that the annual benefits from BYOD range from $300 to $1,300, depending on the employee’s job role.

    Insight 6: BYOD Does Bring Its Share of Challenges
    • Respondents cited the top challenges of BYOD as (1) ensuring security/privacy of company data and (2) providing IT support for multiple mobile platforms.
    • Thirty-six percent of respondents said that their organizations’ IT departments provide full support for employee-owned devices connected to the company network, with an additional 48 percent indicating that their IT departments support selected devices. Eleven percent said that their companies tolerate employee-owned devices but don’t support them, and just 5 percent said their organizations forbid employee-owned devices.
    • According to Cisco IBSG, 86 percent of BYOD costs are non-hardware-related, highlighting the importance of choosing the right governance and support models to control these costs.

    Insight 7: Employees Want To Control Their Work Experience
    • Employees are turning to BYOD because they want more control of their work experience, thus improving productivity and job satisfaction.
    • Forty percent of respondents cited “device choice” as their top BYOD priority (the ability to use their favorite device — anywhere).
    • Respondents’ second BYOD priority is the desire to perform personal activities at work, and work activities during personal time.
    • Employees also want to bring their own applications to work. Sixty-nine percent of respondents said that unapproved applications — especially social networks, cloud-based email, and instant messaging — are somewhat to much more prevalent today than two years ago.

    Insight 8: Desktop Virtualization Is on the Rise
    • Desktop virtualization enables employees to enjoy a similar experience across a broad range of devices — from desktop and laptop PCs to smartphones and tablets. This capability is alternately referred to as virtual desktop infrastructure (VDI), hosted virtual desktop (HVD), desktop as a service (DaaS), and server-based computing.
    • Eighty percent of respondents indicated that they are “very aware” of desktop virtualization, and 18 percent said they are “somewhat aware.”
    • Sixty-eight percent of respondents agreed that a majority of knowledge worker roles are suitable for desktop virtualization.
    • Fifty percent noted that their organization is in the process of implementing a desktop virtualization strategy.

    Insight 9: Desktop Virtualization Also Poses Challenges
    • While 70 percent of IT leaders recognize that half or more of their organization’s employees could benefit from desktop virtualization, they also expressed some concerns.
    • Respondents’ top concern (33 percent) was data protection — ensuring that only the right people have access to sensitive company and customer data. The No. 2 concern was business continuity — the ability to continue operations under adverse conditions, such as interruptions due to natural or man-made hazards.

    Insight 10: Desktop Virtualization Will Impact Much of the Business
    • Desktop virtualization is already making its mark and will continue to have a significant impact on enterprise business. Survey respondents noted the following as the three areas that will benefit most from desktop virtualization: (1) business continuity, (2) employee productivity, and (3) IT costs.
    • Among devices, respondents listed their top desktop virtualization priorities as laptops (81 percent), desktops (76 percent), smartphones (64 percent), and tablets (60 percent).
    • Survey respondents stated that the top four job roles being targeted for desktop virtualization are (1) field-/customer-facing employees, (2) employees who handle sensitive company data, (3) employees who work from home frequently, and (4) executives.
    • Desktop virtualization and BYOD are changing the way applications are provisioned to employees. For example, 35 percent of respondents said that employees can download only pre-approved applications from the company app store, while 23 percent indicated that both approved and nonstandard applications are available from the company app store.

    Friday, May 11, 2012

    NwHIN: Government Governance of Governances

    Today, the Office of the National Coordinator for Health Information Technology (ONC) has released a Request for Information (RFI) regarding the governance of a Nationwide Health Information Network (NwHIN). The document outlines ONC’s current thinking on the subject and poses 66 questions to the public. The NwHIN is the proposed vehicle by which secure and presumably trusted health information exchange is facilitated and accelerated. The NwHIN consists, or is envisioned to eventually consist, of a set of standards and policies to govern health information exchange over the Internet. It does not include the actual infrastructure for such exchange. Below is a brief summary of the RFI highlights and the obligatory commentary on the proposed governance methodology.


    The 64-page document, as its title clearly states, is focused on creating trust in the exchange of health information at a national level. To that end, ONC is proposing to define a set of policies and regulations to be adhered to by participants in information exchange as “conditions for trusted exchange” (CTE). Consistent with current direction and the funding of Health Information Exchange (HIE) organizations, ONC is envisioning a set of entities specifically built for, or specializing in, the exchange of health information. These new entities (or new services) are named Nationwide Health Information Network Validated Entities (NVEs), and very much resemble what was previously referred to as Health Information Service Providers (HISPs) in the context of Direct Project based exchange.

    Going forward, ONC proposes to assume responsibility for “oversight of all entities and processes established as part of the governance mechanism”, including management and endorsement of CTEs, “selection and oversight processes for an accreditation body that would be responsible for accrediting organizations interested in becoming validation bodies” and “[a]uthorizing and overseeing validation bodies which would be responsible for validating that eligible entities have met adopted CTEs”. For starters, ONC proposes three types of CTEs with the understanding that many others will be added in the future. Here is an (almost) verbatim list of the proposed CTEs:

    Safeguards
    [S-1]: An NVE must comply with a good portion of the HIPAA regulations as if it were a covered entity.
    [S-2]: An NVE must only facilitate electronic health information exchange for parties it has authenticated and authorized, either directly or indirectly.
    [S-3]: An NVE must ensure that individuals are provided with a meaningful choice regarding whether their Individually Identifiable Health Information (IIHI) may be exchanged by the NVE.
    [S-4]: An NVE must only exchange encrypted IIHI.
    [S-5]: An NVE must make publicly available a notice of its data practices describing why IIHI is collected, how it is used, and to whom and for what reason it is disclosed.
    [S-6]: An NVE must not use or disclose de-identified health information to which it has access for any commercial purpose.
    [S-7]: An NVE must operate its services with high availability.
    [S-8]: If an NVE assembles or aggregates health information that results in a unique set of IIHI, then it must provide individuals with electronic access to their unique set of IIHI.
    [S-9]: If an NVE assembles or aggregates health information which results in a unique set of IIHI, then it must provide individuals with the right to request a correction and/or annotation to this unique set of IIHI.
    [S-10]: An NVE must have the means to verify that a provider requesting an individual’s health information through a query and response model has or is in the process of establishing a treatment relationship with that individual.
    Interoperability
    [I-1]: An NVE must be able to facilitate secure electronic health information exchange in two circumstances: 1) when the sender and receiver are known; and 2) when the exchange occurs at the patient’s direction.
    [I-2]: An NVE must follow required standards for establishing and discovering digital certificates.
    [I-3]: An NVE must have the ability to verify and match the subject of a message, including the ability to locate a potential source of available information for a specific subject.
    Business Practices
    [BP-1]: An NVE must send and receive any planned electronic exchange message from another NVE without imposing financial preconditions on any other NVE.
    [BP-2]: An NVE must provide open access to the directory services it provides to enable planned electronic exchange.
    [BP-3]: An NVE must report on users and transaction volume for validated services.

    Considering the broad spectrum of CTEs, the entities accredited to validate NVEs will need a very broad range of capabilities to do a proper job of validating and monitoring exchanges. ONC allows for the possibility that NVEs may be fully or partially validated, similar to EHRs being certified as Complete or Modular, and in both cases it is assumed that NVEs will be able to publicly advertise their compliance status. All these definitions are in a proposal stage, and ONC is requesting input on pretty much the entire proposed structure. You have 30 short days to file your response.


    This is a very technical subject and, with the notable exception of those actively working in health care IT, this publication may not elicit much interest from physicians or patients. However, there is one item in this RFI which prompted me to hurry up and write this post, because after consistently complaining for several years, my wishes have been answered in the form of the beautiful [S-6] CTE!! So here are my impressions of this lovely thought and the document that surrounds it.

    The Exquisite
    After what seems like an eternity, ONC officially recognizes that de-identified information can be rather easily re-identified and that those who happen to own the hardware infrastructure where people’s medical records are stored do not have an inherent right of ownership to those records. I would very much like to see ONC extend this regulation to every HIT vendor, not just those specializing in exchange of information, since if it is pertinent to NVEs, it must be also pertinent to EHRs, HIEs, ancillary software vendors and, yes, pharmacy software vendors. I am not naive enough to believe that CTE [S-6] will survive the rule making process, but for the moment, the detailed description of the dangers inherent in the wholesaling of patient data is reason for celebration.

    The Good
    All Safeguards CTEs (with the exception of [S-9], which could cause havoc in the many places where the data originated) propose regulations that are beneficial to the privacy and security of patients and their medical information. The Interoperability CTEs are also very sensible and actually a bit restrained. Put together, these 12 CTEs, if complied with, should create enough trust in exchanging entities to allay the concerns of physicians and patients regarding the transfer process itself. Other concerns may persist, but it was not the intent of this RFI to address those. Releasing an RFI prior to a formal notice of proposed rulemaking (NPRM) is also a positive sign that ONC is open to considering other opinions (too bad that this is how [S-6] will be killed off). So, even if you don’t clearly see your dog in this fight, read the document (it’s very readable and informative), find your dog, and back him up.

    The Bad
    The Business Practice CTEs are overreaching into the world of private business. ONC is asking if NVEs should perhaps be required to be non-profit. Not a good idea, but even if they are, those entities will need to have a sustainable business model, or forever be dependent on Government grants. If their dreams of making billions from health data are to be crushed, then they must be allowed to make a living by selling services. Current hype notwithstanding, software is not free to develop and maintain in a professional and trustworthy manner. The reporting CTE [BP-3] sounds too much like big government and should not be necessary. Most vendors incessantly advertise their number of customers and transactions, and perhaps statistics is something NVEs should be paid to provide.

    The Ugly
    Bureaucracy, lots of it, expanded and extended indefinitely into the future with no end in sight.

    And now we wait for the public comments to be submitted, the NPRM to be published, more public comments, the final rule to be issued, and the “governance of governances” to be established. Keeping my fingers crossed for little [S-6] to make it to the finish line….

    Thursday, May 10, 2012

    How CIOs Migrate their IT Applications to the Cloud

    As the role of cloud computing grows around the globe, many CIOs and other senior IT decision makers are facing challenges with their existing network infrastructure as they work to support the migration of their business applications to the cloud. A new international study by Cisco Systems revealed the ongoing challenges associated with public or private cloud deployments.

    These latest research findings provide insight into the current state of cloud service adoption and the chasm between IT expectations and network realities. The survey also examines the experiences of IT professionals regarding the level of difficulty and time required to update their networks and migrate their applications to the cloud.

    The 2012 Cisco Global Cloud Networking Survey addresses the applications that are most critical for businesses to move to a cloud services delivery model, as well as the network challenges and potential disruptions and road blocks they are facing during this process. The report also takes a closer look at the typical length of these cloud migrations, and how confident IT professionals are in the ability of their own network deployments to securely deliver an optimal cloud application experience.

    Among its findings, the study reveals that updating the network is one of the top focus areas for cloud migration.

    This data expands on the Cisco Global Cloud Index, which predicts that more than 50 percent of computing workloads in data centers will be cloud-based by 2014, and that global cloud traffic will grow over 12 times by 2015, to 1.6 zettabytes per year – the equivalent of over four days of business-class video for every person on Earth.

    Key findings from the global market study include:

    Cloud Deployments in Perspective
    • Almost two in five (39 percent) of those surveyed said they dread network challenges associated with private or public cloud deployments so much that they would rather get a root canal, dig a ditch, or do their own taxes.
    • At the same time, nearly three quarters (73 percent) feel confident that they have enough information to begin their private or public cloud deployments. However, the remainder (27 percent) feel they know more about how to play Angry Birds than about the steps needed to migrate their company's network and applications to the cloud.
    • In a clear sign that many IT organizations are still considering and planning cloud migrations, nearly one quarter (24 percent) of IT decision makers said that over the next six months, they are more likely to see a UFO, a unicorn or a ghost before they see their company's cloud migration starting and finishing.
    • Without proper processes and planning, nearly one third (31 percent) said they could train for a marathon in less time than it would take to migrate their company's applications to the cloud.
    • A majority (76 percent) predict their cloud applications are likely to be breached, yet only one quarter (24 percent) are confident to the point where they believe the odds are better that they will be struck by lightning than that their cloud applications will be breached by an unwanted third party.

    Cloud Deployments Expected to Increase Significantly by the end of 2012

    • Presently, only 5 percent of IT decision makers have been able to migrate at least half of their total applications to the cloud. By the end of 2012, that number is expected to rise significantly, as one in five (20 percent) will have deployed over half of their total applications to the cloud.

    Most Critical Infrastructure for Cloud Deployments

    • In order to successfully move more applications to the cloud, the majority of respondents cited a cloud-ready network (37 percent) as the biggest infrastructure element required for further cloud deployments, ahead of a virtualized data center (28 percent) or a service-level agreement from a cloud service provider (21 percent).

    Top Infrastructure Roadblocks to Cloud Migration

    • During the cloud migration process, data protection security (72 percent) was cited as the top network challenge or roadblock responsible for preventing a successful implementation of cloud services, followed by availability/reliability of cloud applications (67 percent), device-based security (66 percent), visibility and control of applications across the WAN (60 percent) and overall application performance (60 percent).

    Top Choice of Application for Cloud Migration

    • If given the choice of only being able to move one application to the cloud, most respondents would choose storage (25 percent), followed by enterprise resource planning (ERP) applications to manage HR, customer relationship management, supply chain management, and project management systems (20 percent). Email (16 percent) and collaboration solutions (15 percent) followed.

    Reality Check: Status of Cloud Application Migration

    • When asked which applications have been moved, or are planned to be moved, to public or private clouds in the next year, the majority of IT decision makers cited email and Web services (77 percent), followed by storage (74 percent) and collaboration solutions such as Web conferencing and instant messaging (72 percent).

    Monday, May 7, 2012

    Meaningful Use Stage 1 - Redux

    Escher - Relativity, 1953
    The public comments for the proposed rules for Meaningful Use Stage 2 are closing now. After reading the public submissions of several organizations, I decided not to comment, since what I wanted to say was covered by better-heeled organizations, and I am not convinced that individuals can influence government in any shape or form nowadays. Instead, I thought that this would be a good time to look back at Meaningful Use Stage 1 and see if there are any lessons to be learned, something that CMS did not deem necessary to do before moving up the escalator to Stage 2. This is a bit peculiar considering that only 5% of physicians have attained Meaningful Use so far, and according to a new study published in Health Affairs, less than 14% of physicians said that their EHR has all the bells and whistles necessary for Meaningful Use Stage 1. Since Meaningful Use Stage 2 is largely an amplified version of Stage 1, it may be useful to look back and identify the most troublesome aspects, prior to significantly increasing their magnitude.

    Computerized Physician Order Entry (CPOE) – On the surface, asking that 30% of prescriptions be entered in the EHR seems like an almost trivial task. Where else would you enter them? A closer look reveals that there is some complexity in the definition of the denominator for this measure, and the requirement is that 30% of patients who have at least one active medication on their medication list should be prescribed at least one item using CPOE. Does this mean that the doctor must prescribe something for a third of patients seen? While this may make sense for a primary care doctor with an elderly panel, it does not make sense for other specialties, where patients have most of their medications prescribed elsewhere. And the exclusion for those who write less than 100 scripts per reporting period, which is now one full year, is not an adequate answer either. Perhaps a better measure would be to require a much higher percentage of all prescriptions written, to be entered through the CPOE module, regardless of whether the patients come in with existing meds or not. This is not an easy thing to measure, though it could be done if EHRs were required to have the prescriber identified in the medication list.
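    As a rough illustration (the patient records and field names below are made up for the example, not drawn from the rule), the numerator/denominator logic described above can be sketched in a few lines of Python:

```python
# Hypothetical sketch of the Stage 1 CPOE measure described above.
# Denominator: unique patients with at least one active medication on their list.
# Numerator: those patients with at least one order entered through CPOE.

def cpoe_measure(patients):
    """patients: list of dicts with 'active_meds' and 'cpoe_orders' counts."""
    denominator = [p for p in patients if p["active_meds"] > 0]
    if not denominator:
        return 0.0
    numerator = [p for p in denominator if p["cpoe_orders"] > 0]
    return 100.0 * len(numerator) / len(denominator)

panel = [
    {"active_meds": 2, "cpoe_orders": 1},  # counts toward both
    {"active_meds": 1, "cpoe_orders": 0},  # denominator only
    {"active_meds": 0, "cpoe_orders": 0},  # excluded from the measure entirely
]
print(cpoe_measure(panel))  # 50.0 -- comfortably over the 30% threshold
```

    Note how the third patient simply vanishes from the calculation, which is exactly why a specialist whose patients are medicated elsewhere ends up with a denominator that misrepresents actual prescribing volume.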

    Clinical Summaries – Probably one of the most confusing measures for both patients and physicians requires that 50% of patients be given a visit summary within 3 days of the encounter. This is reasonable and even a bit lax if you have at least 50% of patients using your patient portal. If not, and if you are actually handing out printed summaries, the measure makes little sense. First, the 3-day period is completely contrived. You either give people a summary when they walk out the door, or you don’t. How many patients do you know who will return in the next couple of days to the office to pick up their visit summary? Or how many patients would hang around the office waiting a couple of hours to get that summary? Zero. You either print the thing out before patients check out, or the patient will leave without it. Printing summaries at the end of the visit means that all components have been updated in the EHR during the visit. If you still dictate most of your note, or if you finish notes at the end of the day, or between patients, your summaries may not be accurate. The second problem with this measure is that not all patients want a summary, but Meaningful Use is not making any allowances for patient preferences. So what do people do? They either print all summaries on paper and shred the ones left behind, or print them to electronic file (to trigger the EHR count) and only generate paper if the patient wants a summary. A much more realistic measure would require that summaries be given to 100% of patients who request one during the visit and that 100% of summaries be made available on the portal within 48 hours of the encounter. Click fewer boxes, kill fewer trees.
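    To make the arithmetic concrete, here is a minimal sketch (with made-up visit dates) of how the 50%-within-3-days tally works out in practice:

```python
from datetime import date, timedelta

# Hypothetical sketch of the clinical summary measure discussed above:
# the share of encounters with a summary provided within 3 days.
def summary_measure(encounters, window_days=3):
    """encounters: list of (visit_date, summary_date-or-None) tuples."""
    met = sum(
        1 for visit, summary in encounters
        if summary is not None and summary - visit <= timedelta(days=window_days)
    )
    return 100.0 * met / len(encounters)

visits = [
    (date(2012, 5, 7), date(2012, 5, 7)),   # handed out at checkout
    (date(2012, 5, 7), date(2012, 5, 11)),  # four days later -- too late
    (date(2012, 5, 7), None),               # patient never got one
]
print(round(summary_measure(visits), 1))  # 33.3 -- short of the 50% mark
```

    The patient who never received a summary still drags the percentage down, which is the "no allowances for patient preferences" problem in a nutshell.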

    Electronic Copy of Medical Records – This most peculiar measure requires that half of all patients requesting medical records are accommodated within 4 business days. Why only half? HIPAA guarantees the right of all patients to obtain copies of their medical records. Meaningful Use requires an electronic option. If the EHR is capable of packaging a chart in electronic format, why would you only give a copy to half of the people who want it? How does CMS know who wants a copy of their records? There is a checkbox, of course, and if you clicked it, chances are very good that you will also click the button to generate a chart export, which should yield a perfect 100% score. A perfect score is what this measure should have required.
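    Assuming a hypothetical log of request and fulfillment dates, the "within 4 business days" window comes down to a simple weekday count:

```python
from datetime import date, timedelta

# Hypothetical sketch: count business days between a records request and its
# fulfillment, then score the share of requests met within 4 business days.
def business_days(start, end):
    count, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            count += 1
    return count

requests = [
    (date(2012, 5, 7), date(2012, 5, 9)),   # Mon -> Wed: 2 business days
    (date(2012, 5, 4), date(2012, 5, 14)),  # Fri -> Mon next week: 6 business days
]
on_time = sum(1 for req, done in requests if business_days(req, done) <= 4)
print(100.0 * on_time / len(requests))  # 50.0 -- exactly the required half
```

    As the sketch shows, a practice fulfilling only one of every two requests on time technically meets the measure, which is the oddity the paragraph above complains about.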

    Public Health Reporting – There are two public health measures to choose from in Stage 1, immunizations data and syndromic data, and only a test of capability was required; even a failed test was just fine. It should have been a slam dunk, but it was not. Most public health entities were not ready to receive electronic data, and most certified EHRs were incapable of transmitting anything in spite of being certified to that capability. I have no suggestions for how to improve these measures other than refraining from requiring nonexistent things and ensuring that EHR certification is slightly more than fee-for-rubber-stamping.

    Medications Reconciliation – This action is required to occur for 50% of care transitions. Since certified EHRs are not required to provide true electronic reconciliation of two datasets and since one almost never has a structured second medication list from an outside source, the best case scenario consists of clicking on two boxes – one at the front desk, designating the visit as a “transition of care”, and one during the visit, validating that the physician (or assistant) opened the medication list page. This measure should have been postponed until the ability to measure, and the tools to actually perform reconciliation, become available.

    Security Risk Review – Nobody knows what that is. It is, however, a wonderful opportunity for IT guys to relieve small practices of anything between $2,000 and $5,000, and suggest that the server should reside in a cabinet and that the virus protection software should be updated. Why CMS is getting itself involved with HIPAA and security is a mystery to me, considering that these things are under the purview of the Office for Civil Rights (OCR), which is actively pursuing the matter independently of Meaningful Use. This measure should not have existed.

    Clinical Quality Measures – Much has been written about the inadequacy of the chosen quality measures for anything but primary care. Also, the seemingly generous choice of measures is largely theoretical, because certified EHRs are not required to certify for all measures, and users are basically stuck with whatever the EHR vendor chose to certify, whether the certified measures are applicable to the practice or not. For example, one very popular EHR only has diabetic menu measures available. If you are, say, a dermatologist, you will have to report weight management as part of the core and also report dismal measures for your ongoing management of HbA1c. The other major problem with clinical quality measures is their hidden complexity: the data elements required to calculate a measure, and particularly the exclusions of patients from a given measure, are not immediately obvious. A very interesting example of how some EHRs deal with such complexity is the new and fairly common checkbox next to any given diagnosis to mark the condition as terminal in less than 6 months, since this is an exclusion to the weight management core measure. Seriously? And this should be made available on the patient portal too? Clinical quality measures should be carefully reconsidered and dialed back to a sensible set that is truly meaningful.

    Meaningful Use Stage 1 created a lot of confusion amongst providers trying in earnest to meet the measures and gobbled up scarce resources in organizations big and small. Many of the measures are being attested to with very little confidence in accuracy, definition and meaningfulness. It would have probably been a very good idea for CMS to go out there and survey its existing and potential meaningful users to seek some authentic guidance, instead of relying on professional advocates (and occasional testimony from carefully selected users), before putting in place the next theoretical step on the now famous escalator.

    Cloud-Based Collaboration Services in Asia-Pacific Gov

    Government agencies are reportedly among the primary beneficiaries of managed cloud services. A recent IDC market study of IT decision-makers across the Asia-Pacific region (excluding Japan) found that 59 percent of public sector respondents are confident in the ability of their internal IT departments to deploy private cloud environments.

    However, IDC Government Insights cautions that high levels of private cloud adoption may not bode well for a collaborative and citizen-engaging government and preemptive measures should be taken for collaboration to take place across organizational boundaries. More insights can be found in the IDC report entitled, "Cloud Computing for Government: a View from Asia-Pacific."

    Frank Levering, Research Manager for IDC, said, "An efficient and productive internal IT department is definitely good to have in any organization, private or public. However, a department that is highly confident in running its own private cloud environment may run the risk of not reaching out to other internal departments to collaborate on cloud opportunities."

    To counter this possibility, IDC recommends that whenever possible, governments should consider cloud-based collaboration services rather than independent private cloud solutions.

    Although governments will initially seek cloud-based solutions to deliver cost advantages and better manage resources, eventually cloud implementations need to be about inter-department collaboration and citizen relationship management in order to reap the cloud's full benefits and deliver optimal citizen services.

    This is particularly important for key initiatives like data classification for security purposes; if agencies do not align their security levels, it would prove to be a massive obstacle for future joint efforts.

    A positive sign is that governments across the region are growing to recognize the need for collaboration within the cloud space. There is already a significant installed base of collaborative applications in the cloud and the numbers will grow significantly in the next 12 months.

    To optimize the benefits of cloud services, IDC offers recommendations to governments:
    • Evaluate all aspects of cloud computing. Read everything you can get your hands on. Most suppliers will have recognized that the key to their long-term success is their short-term role as an educator. Since security is a big concern, develop security profiles for all suppliers being considered.
    • Service-oriented architecture (SOA) first, then cloud. The right SOA needs to be in place to facilitate a smooth connection to external cloud services. Government agencies needing to build a robust SOA require a plan that tackles the transition in bite-size pieces while solidifying the long-term migration to a shared services architecture.
    • Remove the key barriers to cloud computing. Challenges like security concerns and decentralized data storage will be blocking issues until they are acknowledged and appropriately addressed. Many of the more complex scenarios, like customer/citizen relationship management and inter-department collaboration, will depend on a government's ability to get the basics right.
    • Know your current environment. An inventory of the current environment should provide a good indication of whether systems contain sensitive data, including taxpayers' personally identifiable information and/or mission-critical data and (legacy) applications. This will provide an excellent start to planning for cloud services adoption.