David D’Agostino’s answer to Are there any resources to read about licensing & Copyright issues involved in media services like Last.fm, Grooveshark etc.?

For interactive streaming services, content is licensed from the Sound Recording Copyright Owner (SRCO); these are "voluntary" licenses. For satellite or internet radio stations, "statutory" (or compulsory) rates apply.

The Harry Fox Agency (HFA) has information on digital licensing:
http://harryfox.com/public/Digit…

SoundExchange collects and distributes royalties from statutory licenses, including:

– Digital cable and satellite television services (Music Choice and Muzak)
– Non-interactive “webcasters” (including original programmers and retransmissions of FCC-licensed radio stations by aggregators)
– Satellite radio services (XM and SIRIUS)

http://soundexchange.com/categor…

The Copyright Office has general information on copyrights:
http://www.copyright.gov/help/faq/

The RIAA explains the differences between voluntary and statutory licenses here:
http://www.riaa.com/whatwedo.php…

Voluntary License

Most of the time, licenses are granted voluntarily by copyright owners for a negotiated fee and pursuant to agreed upon terms and conditions. These are called voluntary (or direct) licenses. Licenses usually take the form of a written contract that specifies the owner of the copyright, what rights are being granted, the term of the license, and the royalties, if any, to be paid the copyright owner…

. . . Offering a jukebox on the Internet. Interactive services do not qualify for a statutory license. Instead, such operators must obtain performance licenses from individual copyright owners, just like other webcasting services. Interactive services include those that permit a listener to choose a particular song and those that create a personalized program for the listener. If copies were being made into the computer server, operators would need to negotiate reproduction rights also.

Statutory Licenses

The U.S. Congress has determined that, in certain limited circumstances and for public policy reasons, the government should determine the terms, conditions, and rates for a limited class of copyright licenses. For example, such a government created license may enable licensees to avoid entering into separate negotiations with numerous individual copyright holders, and thus create efficiencies that benefit society as a whole. Such licenses are called statutory (or compulsory) licenses, and generally the fee in such situations is paid according to a rate set by law, called a "statutory rate."

In the music world, some types of performance and reproductions of sound recordings qualify for a statutory license. The most common type of use covered by these statutory licenses is for non-interactive webcasting or Internet radio. The sound recordings that you might hear through a satellite system in your car, or at home over your digital cable service, also are provided pursuant to a statutory license.

Posted in IT | Leave a comment

Is hipster a really bad brand name?

http://eu.techcrunch.com/2011/01/18/hipster-is-location-based-qa-call-it-quora-for-where-you-are/

Posted in IT | Leave a comment

How do I connect an iPhone 4 to a stereo?

Posted in IT | Leave a comment

iPhone 4: Can I charge my iPhone 4 with my iPad’s charger?

Posted in IT | Leave a comment

Deconstructing the Cloud

What the hell is Cloud Computing?

Cloud computing is not only the future of computing, it is the present and the entire past of computing…My objection is its absurdity–it’s nonsense … It’s not water vapor. It’s a computer attached to a network!

~ Larry Ellison, Oracle CEO

Definition(s) of Cloud Computing

…a style of computing where scalable and elastic IT capabilities are delivered as a service to consumers using Internet technologies.

~ Gartner: Key Issues in Cloud Computing

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

~ The NIST Definition of Cloud Computing, Version 15

Why Larry Ellison Hates Cloud Computing

What’s new with Cloud Computing?

Larry makes some good points about the components of cloud computing having been around for decades.  So is there anything new?

First, let’s recognize the convergence of IT trends and enabling technologies that set the stage for the rise of the cloud.

1998 – 2008: Convergence

Technology Trends

  • Network and Data Center Buildout
  • Ubiquitous Broadband
  • Mobile Access (Wi-Fi, 3G, Air Cards)
  • Mobile Devices (BlackBerry, iPhone, SmartPhone, Netbooks)

Infrastructure Technologies

  • Virtualization
  • Blade Servers

The chart below shows interest in cloud computing over time as measured by Google searches.  Starting from zero, a blip of interest first surfaces in 2007, the climb begins in 2008, and peaks in March 2010.  As Gartner would say, our current hype cycle position is at the peak of inflated expectations.

So what’s responsible for the dramatic rise in interest over the past two years?  I believe there are three main drivers: lower costs, market offerings, and marketing hype.

2009 – 2010: Tipping Point

  1. Lower Costs (for everything: hardware, software, storage, memory, bandwidth, etc.)
  2. Market Offerings and Opportunities
      • Microsoft Online, Google Enterprise Apps
      • Microsoft Azure, Amazon EC2 (Elastic Compute Cloud) & AWS (Amazon Web Services), Google App Engine
  3. Marketing Hype!

While marketing hype has certainly played a role, there are also some new service offerings and opportunities that have been enabled by declining costs and economies of scale made possible through virtualization technologies.  The existence of these offerings, and the shift in their pricing from competitive to compelling, represent what’s new.

For example, when we evaluated our email options in 2007, we could let it remain on-premise or turn it over to an outsourcing vendor who would host it on dedicated hardware, for almost twice as much as it cost to run internally.

Two years later, we did the same evaluation after Microsoft announced their Exchange Online offering.  By this time, dedicated outsourcing was marginally less expensive than running on premise, while Exchange Online from Microsoft cost less than half as much.  Then Microsoft lost 30,000 email seats for the city of Los Angeles to Google, and they cut their price in half (to $5 per month) while increasing storage five-fold (to 25 GB).

The relative cost of outsourced email had dropped from 191% of the on premise cost in 2007 to 20% in 2009: nearly an order of magnitude decrease in cost.
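A minimal sketch of that arithmetic, assuming a hypothetical on-premise cost per seat (the $5 Exchange Online price and the 191%/20% ratios are from our evaluations; the baseline figure is invented for illustration):

    # Relative cost of outsourced email vs. on-premise, 2007 vs. 2009.
    # The $5/seat Exchange Online price and the 191% / 20% ratios are from
    # the evaluations above; the on-premise baseline is a hypothetical figure.

    def relative_cost(outsourced: float, on_premise: float) -> float:
        """Outsourced cost as a percentage of the on-premise cost."""
        return 100 * outsourced / on_premise

    on_premise_per_seat = 25.00                    # assumed $/seat/month internally
    dedicated_2007 = 1.91 * on_premise_per_seat    # dedicated hosting, ~191% of internal
    exchange_online_2009 = 5.00                    # Microsoft's post-price-cut rate

    print(f"2007 dedicated hosting: {relative_cost(dedicated_2007, on_premise_per_seat):.0f}% of on-premise")
    print(f"2009 Exchange Online:   {relative_cost(exchange_online_2009, on_premise_per_seat):.0f}% of on-premise")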

Relative Cost of Outsourced Email vs. On Premise: 2007 – 2009

What Makes it “Cloudy”?

Besides the price, what is it about Exchange Online in 2009 that makes it different from the 2007 Outsourced Exchange offering?

Going back to the NIST definition, Exchange Online is provided from a shared pool of resources.  There is no additional capital investment required to add our company to the pool.  You can add or remove users through an online administration console, and your monthly bill will reflect the number of active users.  All data will be replicated to a second site, and there won’t be any charges to upgrade.

In contrast, the outsourced offering does require a capital investment by the service provider, and that cost is reflected in the bill, even if the invoice lists only per seat charges.  To add or remove a user, you may need to submit a ticket to the service provider, and wait until they confirm the task is complete.

If you make an acquisition and the size and number of mailboxes exceeds the initial capacity, additional time and expense may be required to accommodate the new users.  Depending on the agreement, backups and upgrades may incur additional costs, and the SLAs may not be able to match those of the online offering.

Of course it’s not all one-sided; there are some advantages to dedicated hosting, especially when it comes to customization.  In any case, there are enough differences between what came before (hosted email) and the new offerings (cloud-based email) that most people will agree that there is something new, even if they debate over what to label it.

What about the private cloud?

Even the cloud crowd can’t agree on this one.  Without being too cynical, it’s easy to see why some companies (or even divisions or individuals within a single company) might have differing opinions.

Cloud Service Providers – There is no such thing as a private cloud.  It’s only a cloud service if you buy it from us!

Hardware / Software Vendors – Of course there can be private clouds, just like there are private intranets and a public internet.  And we can sell you a cloud operating system and some cloud hardware to build your own!

Larry Ellison – What Cloud?  It’s a computer attached to a network!

Some vendors play on both sides, giving them an incentive to be more flexible with their definitions.  And their actual statements (with one exception) are a bit more nuanced.   In “Cloud Computing – the next evolution or another dot.com?” Balakrishna Narasimhan attempts to translate vendor statements from marketing to meaning.

IBM
What they say: “Private” clouds offer many of the same benefits as “public” clouds but are managed within the organization. These types of clouds are not burdened by network bandwidth and availability issues or potential security exposures that may be associated with public clouds. Private clouds can offer the provider and user greater control, security and resilience.
What they really mean: Cloud computing is a better datacenter

HP
What they say: Cloud research is focused on delivering an application and computing end-state of Everything-as-a-Service: billions of users, accessing millions of services, through thousands of service providers, over millions of servers, processing exabytes of data, delivered through terabytes of network traffic.
What they really mean: Cloud computing means more hardware and networking

Oracle
What they say: “We’ve redefined ‘cloud computing’ to include everything we currently do. So it has already achieved dominance in the industry. I can’t think of anything that isn’t cloud computing.”
What they really mean: Cloud computing is nothing new

SAP
What they say: “the integration of on-site and off-site software on the vendor’s “loosely coupled, asynchronous” SOA platform”
What they really mean: Cloud computing is a better Enterprise SOA

Microsoft
What they say: “The future is a combination of local software and Internet services interacting with one another. Software makes services better and services make software better. And by bringing together the best of both worlds, we maximize choice, flexibility and capabilities for our customers. We describe this evolutionary path in our industry as Software + Services.”
What they really mean: Cloud computing is desktop software, enhanced with internet-delivered data and access

Google
What they say: “It starts with the premise that the data services and architecture should be on servers. We call it cloud computing – they should be in a ‘cloud’ somewhere. And that if you have the right kind of browser or the right kind of access, it doesn’t matter whether you have a PC or a Mac or a mobile phone or a BlackBerry or what have you – or new devices still to be developed – you can get access to the cloud”
What they really mean: Cloud computing is internet-enabled apps on a massively scaleable platform

Salesforce
What they say: “Cloud computing offers almost unlimited computing power and collaboration at a massive scale. With Force.com Platform-as-Service, we are providing the necessary building blocks to make cloud computing real for the enterprise.”
What they really mean: Cloud computing is SaaS + PaaS

Amazon
What they say: “cloud computing is that you can have all the resources that you want, could be storage, compute, networking, with an infinite amount of capacity, available to you to use on the internet, the only thing you need to use it is a credit card”
What they really mean: Cloud computing is raw computing power, storage and networking as a service

[Via Appirio]

With the tagline “Accelerating Enterprise Adoption of the Cloud”, Appirio also has some enlightened self-interest at stake in these definitions.  From the same article, what they say:

Here at Appirio, we are dedicated to helping companies do more with cloud computing. That’s why we partner with companies like Salesforce, Google, Amazon, and Facebook, who are truly delivering on the promise of cloud computing. We help our clients steer clear of near-cloud concepts like “private cloud” and “software + service” because we believe they mitigate or eliminate many of the benefits of cloud computing.

What they mean: “Buy what we (re-)sell — the other stuff’s no good.”

I think focusing on the benefits is the right approach, but again, it’s not all one-sided.  Many of the benefits claimed for cloud computing can be realized internally by using server virtualization, shared storage, and off-site replication.  Not everything belongs in the public cloud, and in some cases an internal (or “private”) approach may be the best solution.

Whether the internal approach qualifies as “cloud computing” or not is another subject for debate.  Dave Girouard, President of Google Enterprise, is not a believer in the private cloud.  At the Google Atmosphere conference last month, he said the cloud computing term had been co-opted by non-cloud vendors.  His litmus test: someone else incurs a lot of CAPEX (capital expenditures) so you can spend a little OPEX (operating expenditures).  Otherwise, it’s not cloud computing.

A Few Words from the Believers

If you really want to believe, there are numerous evangelists with their own private cloud texts to follow.  We will look at three: NIST, Gartner, and VMware.

What NIST Believes

The NIST cloud model is composed of five essential characteristics, three service models, and four deployment models.

5 Characteristics

• On-demand self-service
• Broad Network Access
• Resource Pooling
• Rapid Elasticity
• Measured Service

3 Service Models

• Software as a Service
• Platform as a Service
• Infrastructure as a Service

4 Deployment Models

• Private Cloud
• Community Cloud
• Public Cloud
• Hybrid Cloud

As NIST defines the private cloud: “The cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on premise or off premise.”

Allowing the infrastructure to exist on premise and be managed by the organization seems to contradict or negate some of the benefits and essential characteristics of cloud computing.  Except for the largest and most diversified of companies, supporting multi-tenant models and the appearance of “unlimited provisioning capabilities” will pose some challenges.

Even if those challenges are met, it is difficult to discern meaningful differences between the on-premise, company-managed version of a private cloud and an internal data center using virtualization, off-site replication, self-service, and chargeback technologies.  Is that all it is?

What Gartner Believes

Gartner proposes that to be called a cloud service, a solution must adhere to “some combination” of these attributes:

5 Attributes

• Service-Based
• Scalable and Elastic
• Shared
• Metered by Use
• Uses Internet Technologies

Allowing “some combination” of these attributes provides a lot of wiggle room.  They then note two characteristics that distinguish private cloud computing:

Limited Membership (that is, exclusive membership): A private cloud implementation has a bounded membership that is exclusive.

Spectrum of Control/Ownership: A private cloud service is different from a public cloud service in that private cloud services are implemented for an exclusive set of consumers. There is a spectrum from fully private services to fully public services that blurs distinctions of ownership or control.

Gartner defines private cloud computing as “a style of computing where scalable and elastic IT-enabled capabilities are delivered as a service to internal customers using Internet technologies.”

This is very similar to their general definition of cloud computing that appears in the first section of this article: “consumers” has been replaced by “internal customers”.   We should note at least two changes to Gartner’s general definition over time: 1) “consumers” was originally “external customers”; and 2) “scalable and elastic” originally appeared as “massively scalable”.  The changes seem primarily designed to allow for the construct of a private cloud.

What VMware Believes

VMware distinguishes between Internal Clouds (on premise) and Private Clouds: “Cloud infrastructure can reside within the company’s datacenters (as internal clouds or on-premise solutions) or externally on the Internet (via external clouds or off-premise solutions). It encompasses any, per-unit-accountable, subscription-based or pay-per-use service that extends IT’s existing capabilities.”

They also specify three use cases that map pretty closely to the 3 NIST Service Models.  (The NIST models are in parentheses following the VMware cloud types.)

  1. Application and Information Clouds (Cloud Software as a Service) – e.g., Salesforce.com, Google Apps.
  2. Development Clouds (Cloud Platform as a Service) –  e.g., Amazon EC2, Google App Engine
  3. Infrastructure Clouds (Cloud Infrastructure as a Service) – e.g., Amazon Web Services, Hosting.com

VMware views the Private Cloud as an off-premise extension of an internal cloud.

VMware identifies Eight Key Ingredients for Building an Internal Cloud

  1. Shared Infrastructure
  2. Self-Service Automated Portal
  3. Scalable
  4. Rich Application Container
  5. Programmatic Control
  6. 100% Virtual Hardware Abstraction
  7. Strong Multi-Tenancy
  8. Chargeback

Not surprisingly, VMware sells these ingredients wrapped within and around what they call “The World’s first Cloud Operating System” (vSphere 4).
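To make “measured service” and “chargeback” concrete, here is a minimal sketch of a usage-metered chargeback calculation; the resource rates and department usage figures are hypothetical, not taken from VMware or any other vendor.

    # Minimal chargeback sketch: meter each department's consumption and
    # bill it at internal rates. All rates and usage figures are hypothetical.

    RATES = {                  # internal cost per unit per month
        "vcpu": 15.00,         # per virtual CPU
        "ram_gb": 5.00,        # per GB of RAM
        "storage_gb": 0.25,    # per GB of shared storage
    }

    usage = {                  # metered consumption by department
        "finance":   {"vcpu": 8, "ram_gb": 32, "storage_gb": 500},
        "marketing": {"vcpu": 4, "ram_gb": 16, "storage_gb": 2000},
    }

    for dept, consumed in usage.items():
        bill = sum(RATES[resource] * amount for resource, amount in consumed.items())
        print(f"{dept}: ${bill:,.2f}/month")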

What I Believe

There is an important set of benefits commonly ascribed to cloud computing that are a natural result of implementing the computing model described by NIST.  Some of the solutions brought to market in the past two or three years deliver many of these benefits and can be substantially differentiated from their predecessors.

Cloud Computing Benefits

  1. Cost containment—The cloud offers enterprises the option of scalability without the serious financial commitments required for infrastructure purchase and maintenance.
  2. Immediacy—Many early adopters of cloud computing have cited the ability to provision and utilize a service in a single day.
  3. Availability—Cloud providers have the infrastructure and bandwidth to accommodate business requirements for high speed access, storage and applications. As these providers often have redundant paths, the opportunity for load balancing exists to ensure that systems are not overloaded and services delayed.
  4. Scalability—With unconstrained capacity, cloud services offer increased flexibility and scalability for evolving IT needs.
  5. Efficiency—Reallocating information management operational activities to the cloud offers businesses a unique opportunity to focus efforts on innovation and research and development.
  6. Resiliency – Cloud providers have mirrored solutions that can be utilized in a disaster scenario as well as for load-balancing traffic.

    ~ ISACA: “Cloud Computing: Business Benefits with Security, Governance, and Assurance Perspective”

It is very difficult to see how an infrastructure that is on-premise and operated by the organization’s own IT department could deliver on most of these benefits.  Some would be impossible to achieve through an “internal cloud”, and others could be realized only at extraordinary cost.  In which case, why do it?

It is also important to recognize risks that are unique to cloud computing.  Gartner identifies seven issues customers should raise with prospective cloud service vendors.

Cloud Computing Risks

  1. Privileged user access. Sensitive data processed outside the enterprise brings with it an inherent level of risk, because outsourced services bypass the “physical, logical and personnel controls” IT shops exert over in-house programs.
  2. Regulatory compliance. Customers are ultimately responsible for the security and integrity of their own data, even when it is held by a service provider.
  3. Data location. When you use the cloud, you probably won’t know exactly where your data is hosted. In fact, you might not even know what country it will be stored in.
  4. Data segregation. Data in the cloud is typically in a shared environment alongside data from other customers.
  5. Recovery. Even if you don’t know where your data is, a cloud provider should tell you what will happen to your data and service in case of a disaster.
  6. Investigative support. Investigating inappropriate or illegal activity may be impossible in cloud computing.
  7. Long-term [vendor] viability.

    ~ InfoWorld: “Gartner: Seven cloud-computing security risks”

Considering cloud computing from a risk perspective, it does seem that a private cloud could mitigate many of these risks.  For example, take a look at Amazon’s Virtual Private Cloud offering.  The same holds true for a community cloud such as the Government Clouds that Amazon, Google and Microsoft are racing to build.

The benefits may be diluted, but as long as the infrastructure is built on a third-party platform and the management and operations responsibilities are segregated appropriately, benefits can still be had.  The task would be to properly balance the benefits with the risk management needs of an organization.

If we reject the construct of an internal cloud but accept the notion of a private cloud, we also make room for the hybrid cloud.  This would be where the private or community cloud interfaces with the public cloud.

So a risk/benefit approach to defining cloud computing allows for some flavor of all four NIST deployment models.  What it doesn’t support is a private cloud of the “internal cloud” variety — one that is owned and operated on-premise by a single organization.

While that may be satisfying from a semantic perspective, it is not all that helpful in the real world.  The marketing genie is out of the bottle, and no number of YouTube rants will put her back in.  This holds for cloud deniers of all stripes.

“Cloud is a euphemism for an abstraction.”

~ Gartner Cloud Computing Workshop

So instead of fighting over definitions, it makes more sense to keep the conversation focused on benefits and risks.  When a vendor raises the cloud banner, ask which of the benefits will be realized and to what extent.

For example, how “elastic” is the pricing? If your need for a service drops to zero tomorrow or next month, will the billing stop?  Or do they require a multi-year contract with minimum annual commitments?
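One way to pin that down is to model both billing schemes against a shrinking workload. Below is a minimal sketch, with all prices and user counts invented for illustration: true pay-per-use billing stops when usage stops, while a minimum commitment keeps billing the floor.

    # Pay-per-use vs. minimum-commitment billing as demand falls to zero.
    # All prices and user counts are hypothetical.

    price_per_user = 5.00       # $/user/month under both schemes
    minimum_commit = 100        # contract floor: billed for at least 100 users

    monthly_users = [200, 150, 80, 20, 0, 0]    # demand tapering off

    pay_per_use = sum(u * price_per_user for u in monthly_users)
    committed = sum(max(u, minimum_commit) * price_per_user for u in monthly_users)

    print(f"Pay-per-use total:   ${pay_per_use:,.2f}")   # $2,250.00
    print(f"With minimum commit: ${committed:,.2f}")     # $3,750.00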

And how will they mitigate the risks?  To make an informed decision, you need to go in knowing what benefits you’re after, and which risks are most important for you to avoid.

Many current vendor offerings are in beta or limited beta, so there are likely risks that have yet to surface.  This meta-risk (the risk of not knowing what the true risks are) should be part of the equation, especially for mission-critical applications.

It really doesn’t matter if a proposed solution meets the requirements of evolving and conflicting definitions of the cloud, but it helps to be knowledgeable about the debate.  That way you can clear away some of the smoke and focus on what does matter: does it deliver the desired business benefits, at a fair price, with acceptable risk?

Posted in IT | Tagged , , , , , , , | Leave a comment

Midmarket CIO Forum: March 14 – 16, 2010

This post is intended to document some of the product, service, and peer benchmarking information taken away from the Midmarket CIO Forum.  The event was run by IT Business Edge and had 15 vendors hosting 80 IT Executives.  Information was exchanged in various settings:

  • Boardroom Case Studies: Vendors presenting to a group of 12 IT Executives
  • Luminary Presentations: Speakers presenting to the entire group
  • Vendor Roundtables: Vendors presenting to a group of IT Executives interested in a specific topic
  • Vendor One-on-Ones: Scheduled or ad-hoc meetings with individual vendors
  • Peer-to-Peer Roundtables: Discussion among IT Executives interested in a specific topic, moderated by IT Executives
  • Informal Peer-to-Peer exchanges

Virtualization and Cloud computing seemed to be at the top of the list for many participants and vendors.  Not surprising, as Gartner’s 2010 CIO Agenda survey of more than 900 CIOs listed them as the No. 1 and No. 2 tech initiatives for 2010 (they didn’t appear in the list at all for 2009).

Server Virtualization

We seem to be a little ahead of most groups in our server virtualization efforts.  In our boardroom, when asked who had virtualized more than 40% of their production environment, the majority of the group (12 IT Executives) raised their hands.  Some were halfway there or just finishing up their projects.

Two companies were using Hyper-V for virtualization; 10 were using VMware.  Most companies were using Exchange, and a few had virtualized Exchange 2007 and were happy with the results.

Desktop Virtualization

Desktop virtualization was a different story.  Although a fair number of companies were using application virtualization (mostly with VMware ThinApp) and enjoyed the benefits, very few were using Desktop Virtualization in a production environment.

Three CIOs I spoke with were either starting or finishing up what they called their “last” desktop refresh project.  Two of them plan to host Virtual Desktops on ESX Servers using VMware View, accessed through thin clients such as Wyse or “Zero Clients” from Pano Logic.

The Pano box is an endpoint device for virtual desktop computing that contains no processors, no operating system, no memory, no drivers, no software and no moving parts. Pano Logic has a starter kit with 5 devices available for $1,899.

Pano Cubes

Pano Logic was the hardware winner and VMware View 4 was the Clear Choice Test winner for NetworkWorld’s VDI Shootout last year.

The third person contemplating the end of the desktop refresh was the CIO of a mid-sized city.  Having just moved from Lotus Notes to Google Mail, he is thinking about weaning his users from Microsoft Office to Google Apps, and wonders if the Chrome OS and a browser will be all he needs by the time the next refresh cycle comes up.

Cloud Computing

I moderated a peer-to-peer roundtable on cloud computing at the Forum’s first event on Sunday.  While there was the expected amount of confusion over what cloud computing actually is, there was some agreement on what it isn’t.

Most participants felt that traditional hosting and outsourcing did not qualify as cloud computing, since they could leave you with the same basic infrastructure and vulnerabilities (e.g., single point of failure).  The only difference is that the potential problems are located off-site and managed by someone else.

There was even more puzzlement over the concept of a “private cloud”.  I will examine this topic further in a later post, but the consensus at the roundtable was that there are real benefits to some aspects of cloud computing, and it’s more important to realize those benefits in a way that makes sense for the business than to worry about meeting the particular requirements of evolving definitions.  (See Why Larry Ellison hates Cloud computing.)

From a practical perspective, a lot of companies are talking about cloud computing, but other than email filtering, very few are actually doing anything with it.  The one exception at our table was the city CIO (see above), who seems ready to go all in with the cloud crowd.  He was even questioning whether he needs to bother with Windows 7 at all, since a browser may be all the city needs to conduct business in a year or so.

Windows 7

Rob Enderle gave a Luminary presentation on Windows 7, and had an interesting backstory on the internal dynamics at Microsoft that produced the “train wreck” that was Vista.

Only one person in the audience had rolled out Vista in production.  Most were using Windows 7 within IT and with pilot groups.  Rob’s opinion was that a big-bang rollout was best, but that nobody can afford it, so most upgrades will be done incrementally.

Rob’s opinion of Windows 7 was very favorable, which matches what I heard from most attendees.  The Office 2010 Beta received high marks as well.  Here are the tips and best practices presented by Rob:

  • Best to roll out with new hardware
  • Deploy by group to avoid PC Envy
  • Most hardware these days is built to last 3 years with a cushion to avoid in-warranty repairs; don’t hang on or delay upgrades too long
  • Consider employee purchased HW connecting to virtual desktops
  • Consider SSD for notebooks (heard a lot of this one)
  • Transition from 32-bit to 64-bit; next platform will be pure 64-bit
  • In addition to improved performance and memory usage, 64-bit machines are more stable; some viruses can’t execute in 64-bit machines
  • Get Matched Memory for best performance (with dual-channel architecture) and to avoid problems
  • Most PC problems are memory problems, but present in different ways
  • Most everything built after 2008 has matched memory (same size, speed, vendor)
  • If problems arise with older machines, replace single or unmatched DIMMs with matched memory

Vendors

Below are the vendors I spoke with at the conference; most made presentations to our boardroom.

Tango/04 – Pretty slick BPM, Data, SLA,  and Infrastructure Monitoring package.  Has solutions configured specifically for compliance and for IBM iSeries data monitoring.

Conclusion: More than we need internally.  Requires an estimated 1 week of professional services for basic infrastructure monitoring, and 4 weeks for BPM.  Ongoing administration requires Python programming expertise.

Sanbolic – Interesting product that sits between servers and storage and virtualizes your SAN with a clustered file system and clustered volume manager.  Designed for use with Windows Servers, it allows you to aggregate multiple storage arrays into a single pool of storage.  Uses de-duplication and replication to reduce storage needs and provide active-active SQL Server clusters.

Has solutions for Citrix and VMware, but focuses most heavily on Hyper-V.  $16K for base license covering 2 physical servers and 8 virtual servers.  $400 for each additional virtual server.  No training or professional services required.

Conclusion: Seems geared primarily to fill gaps in Hyper-V implementations.  We already get many of the benefits it offers through features of VMware and EMC, and will get more with Avamar.

Message Labs – Provides hosted email, IM, and web security, recently acquired by Symantec.  Provides 100% SLA for Anti-Virus solution.  Plans to offer Hosted Endpoint Protection.

Asked about package deals with Symantec Endpoint Protection.  Currently there are no bundled offerings, but they are working on it.  Client-level protection may always be needed to protect from internal threats, but a hosted solution may be better for external threats.  They are planning to offer client software with a smaller footprint than Symantec’s, designed to be used in conjunction with the hosted solution.

Conclusion: Message Labs was included in our evaluation when message filtering was moved from on-premise to the cloud.  The product was good but relatively expensive.  Email filtering, Continuity, and Discovery products are the most mature.  Would like to evaluate current security tools, including Cisco ASA modules, Symantec Endpoint Protection, and Postini, and see if there is an opportunity for consolidation.

Cast Iron Systems – The last time I saw this company at the Gartner Midsize Enterprise Summit, their offering was a fairly expensive appliance for on-site enterprise application integration.  They now call themselves “The #1 Cloud Integration Company” and offer their solution through a physical or virtual appliance, or as a hosted service.

They have a pretty impressive list of partners, and connectors to allow for point-and-click integration between and among both on-premise and cloud-based applications.  In the cloud, they provide integration services for Microsoft Azure, Amazon EC2, and Google Apps.

They have moved to a subscription pricing model, and charge $250 per month per endpoint.  An endpoint can be an application, database, or other data source such as Active Directory.  Multiple integrations can be created between endpoints with no additional subscription charges.
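A quick sketch of how that subscription model scales (the $250 rate is from their pitch; the endpoint list and integration count are hypothetical): the bill grows with endpoints, not with the number of integrations you build among them.

    # Cast Iron-style pricing: pay per endpoint, not per integration.
    # The $250/month rate is from the pitch; the counts are examples.

    COST_PER_ENDPOINT = 250     # $/month

    endpoints = ["Salesforce", "ADP", "Active Directory", "SQL Server"]
    integrations = 6            # any number of integrations among the endpoints

    monthly = COST_PER_ENDPOINT * len(endpoints)
    print(f"{len(endpoints)} endpoints, {integrations} integrations: ${monthly:,}/month")
    # 4 endpoints cost $1,000/month whether you build 1 integration or 20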

They will perform a no-charge scoping and analysis for proposed integration projects.  They suggest that a company’s first integration project be done together with them; on average, it can be completed in 8 days.  Training is offered so that most future integrations can be done internally, and integrations are shared freely within a customer community.

Conclusion: Integration with ADP has been, and remains, a challenge for us, mainly because there is no test environment for us to work with.  Cast Iron Systems has a formal relationship with ADP, and plans to build a connector to their systems.  They are also in talks with Ultimate Software.  If they can provide connectors for those systems, we should consider a scoping and analysis session.

Google Enterprise – Google’s boardroom presentation focused mainly on Google Apps, with a bit of time devoted to Enterprise Search.  On the Apps front, the big news is the Google Apps Marketplace, which launched March 9, 2010.  Since I’ve covered Google Apps in previous evaluations, I won’t repeat those findings here.

One offering that may be of interest to us is Google Web Security for Enterprise.  This is an add-on to Postini that provides real-time scanning of web requests to protect against spyware, viruses and malware.  It also provides content filtering to manage, monitor, and report on internet usage.

Google Web Security

The Google Search Appliance (GSA) is licensed in two- or three-year contracts, with pricing based on the number of documents crawled.  Pricing for up to 500,000 documents would be $30,000 for two years.
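For budgeting, the effective rate at that price point works out as follows (license figures from the presentation; full utilization of the document capacity is an assumption):

    # GSA price point from the presentation: $30,000 for a two-year license
    # covering up to 500,000 documents. Full utilization is an assumption.

    license_cost = 30_000       # $ for the two-year contract
    months = 24
    doc_capacity = 500_000

    print(f"Effective rate: ${license_cost / months:,.0f}/month")            # $1,250
    print(f"At capacity:    ${license_cost / doc_capacity:.3f}/document")    # $0.060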

Google claims that the GSA can be deployed in 2 to 3 days, versus 2 to 3 months for a FAST implementation.  The GSA is delivered with a SharePoint connector.  The Google Mini is limited to 300,000 documents, and will not index SharePoint, only file shares.

Conclusion: Microsoft has two flavors of Enterprise Search (one from their FAST acquisition) that will be available with SharePoint 2010.  We will wait to see what’s available from Microsoft and evaluate before looking at other search products.

Considering our current tools and user habits, if we were to consider a hosted productivity suite, Microsoft BPOS would be a more likely candidate than Google Apps.

We may want to evaluate Google Web Security as part of a consolidation / rationalization of security tools as mentioned in the conclusion for Message Labs.

TriGeo – TriGeo sells a Security Information and Event Management (SIEM) appliance that provides real-time, in memory log analysis, event correlation, and active response/threat mitigation.  The device also includes a bundled Intrusion Detection System (pre-configured Snort), and USB detection and prevention to protect against data breaches.  TriGeo has a good reputation as a regulatory compliance solution, and bundles over 300 reports and out-of-the-box compliance packs.

Conclusion: TriGeo seems to show up at every Midmarket conference I attend, and there are usually a few attendees who use it and give it high marks.  Most TriGeo deals are in the $30-40K range, and some of its functionality is provided by products currently in use here.  Might consider in a green-field situation, or where there is an urgent need for compliance with PCI, GLBA, FDIC, NCUA, NERC-CIP, HIPAA, SOX, FISMA, ISO 17799/27001/27002, or other regulations.

SpectorSoft – Spector 360 monitors web sites visited, emails sent and received, chats and instant messages, keystrokes typed, files transferred, documents printed and applications run.  For web usage, it monitors active time, focus time, total time.  It can record the user’s desktop with snapshots at rates up to one per second (default is every 30 seconds).

Prices for the Network Edition start at $1,995 for 15 PCs.  There is also a more targeted version to monitor individual PCs, Spector CNE Investigator, starting at $495 for a 3 PC license.

Conclusion: One person in our group was using Spector 360 and was happy with it.  Most everyone else felt it was too intrusive and were shaking their heads after the vendor left the room.  Probably overkill for us, but may be of interest to our more security-minded clients.

Miscellaneous

In addition to the vendor presentations, the boardrooms had working sessions where we discussed topics of our choosing.  The items below are drawn from those conversations.

  • Some are using Cisco or MS Certificates to control access to their Corporate Network (wired or wireless)
  • 4 of 12 don’t allow Facebook due to malware; one other limits usage to 1 hour per day
  • Wireless Device Management – 10 companies pay for the device and plan; at 1, the employee provides the device and receives a stipend for the plan; at 1, the employee pays for everything
  • A few companies are now using Netbooks instead of aircards
  • Some are also using Netbooks as loaners

That’s it.
Posted in IT | 2 Comments