
Monday, October 18, 2010

Best Practices for Validation of a Software as a Service (SaaS) Customer Relationship Management (CRM) Solution - Part 2

Part 2: Best Practices for the Validation of a SaaS Application
Written by Gregg Mauriello - Validation Manager, QPharma

In Part 1, I gave an overview of what Cloud Computing and SaaS are, and I promised more information in the coming months. Part 2 is here, and now I can discuss the best practices for the validation of a SaaS application.   I will look at the methodology for the validation of a SaaS CRM Application and compare it to the conventional methodology for the validation of a CRM Application. 

Since all customers utilize the same instance of a SaaS application, the core functionality of the application can be validated just once.  Also, the vendor may perform a baseline configuration of the application to meet best practices for a CRM application.  This baseline would also be validated just once.  The customer validation, therefore, can be limited to the customer-specific configuration of the application that deviates from the pre-validated baseline.   The validation of the core application and baseline configuration will follow the conventional methodology for validation, including installation and operational qualification.  The vendor must develop business requirements and functional specifications against which to test including the definition of the baseline configuration.  Note: The focus of validation testing will be on functionality of the system with regulatory impact.
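The idea of limiting customer validation to deviations from the pre-validated baseline can be sketched as a simple configuration diff. This is purely illustrative; the setting names below are invented and do not come from any actual CRM product:

```python
# Hypothetical sketch: scoping customer validation to the delta from a
# pre-validated baseline configuration. All setting names are invented.

baseline_config = {
    "approval_workflow": "single_signoff",
    "audit_trail": "enabled",
    "sample_limit_per_call": 2,
}

customer_config = {
    "approval_workflow": "dual_signoff",    # deviates from baseline
    "audit_trail": "enabled",               # matches baseline (pre-validated)
    "sample_limit_per_call": 2,             # matches baseline (pre-validated)
    "custom_territory_rules": "region_v2",  # customer-specific addition
}

def validation_scope(baseline, customer):
    """Return only the settings that deviate from the validated baseline."""
    return {
        key: value
        for key, value in customer.items()
        if baseline.get(key) != value
    }

print(validation_scope(baseline_config, customer_config))
# {'approval_workflow': 'dual_signoff', 'custom_territory_rules': 'region_v2'}
```

Only the two deviating settings fall within the customer's validation scope; everything matching the baseline is covered by the vendor's one-time validation.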

Deliverables:
  • The Vendor Validation Plan will document the process, activities, and deliverables required for the validation of the core application and baseline configuration. 
  • An Installation Qualification will be developed to document all core software and hardware required for the application. Note: The vendor IQ will not address the field device used for the sales force as this will vary greatly by customer. 
  • An Operational Qualification will be developed to test the functionality of the core application and baseline configuration. 
  • A Performance Qualification will not be developed as the system will not be utilized in a production environment until it is deployed for a particular customer. 
  • All validation activities will be summarized in a Validation Summary Report and a Release Memo will be generated, which will document that the application has been validated and is ready for customer configuration and implementation.
All customers utilizing the application will be able to leverage the validation of the core system and baseline configuration, allowing for a lean validation to be performed by the customer.  It is recommended that the customer perform a vendor audit of the core configuration validation package to determine two factors:  
A) If the validation documentation is in line with their internal quality processes and 
B) How much validation of their specific configuration will be required.  

The vendor will also provide a service level agreement which will define the vendor’s role in maintenance and the administration of the system and data.  All procedures that are performed by the vendor, such as backup and recovery, change management (for code and infrastructure), disaster recovery, physical and logical security, system administration, and training, will be documented by the vendor.

Still interested in hearing more? I will be presenting information on this subject during an interactive workshop, at IVT’s 16th Annual Validation Week on October 26th (Session 4 on Day 2!).  The workshop will be presented along with my colleague, Elise Miner.

If you are not attending Val Week, stay tuned for Part 3, where I will discuss customer responsibilities in the validation of a SaaS CRM system.

8 comments:

  1. Hi Gregg,

    Very good post. As a software vendor, we often pushed for these same validation requirements whether the software was deployed on premises or hosted in the cloud.

    Would these same validations of core/baseline functionality vs. client specific configurations hold for configurable OTS software?

    thanks,
    -Bernie

  2. Thanks for kicking this off, but allow the question: can this be called a "Best Practice" already? Have you done this many times? If so, then you are more than correct!
    Our understanding of the term is that once a methodology has proven successful in a specific domain by applying it multiple times, you have a Best Practice for evolving approaches such as SaaS/Cloud.
    May I also ask why you do not include DQ? Our customers do care about Risk Management and therefore expect an audit of DQ applying TMX.
    Comments?
    Thanks and have a good day.
    Ruedi

  3. Bernie,

    Some of the concepts mentioned here of leveraging validation of core functionality/baseline configuration can definitely be applicable for a configurable OTS system. Assuming the vendor develops a compliant validation package for core functionality, the customer who implements that system can leverage the vendor's work to reduce their own internal validation burden. The vendor would of course need to provide (or sell) a copy of the executed validation package to the customer for this to be applicable. There are some key differences between the two models:

    1. In this SaaS model, everyone is utilizing the same instance of the backend application on common infrastructure so the majority of the IQ is done by the hosting vendor. For the OTS model, each customer will be executing a full IQ themselves.

    2. In both scenarios, the customer gets the most leverage out of the functional testing (typically done as an OQ). In the SaaS model, everyone uses the same application, so the functional testing is 100% applicable for everyone. For the OTS model, although the vendor testing was done on the core functionality of exactly the same application, the customer environment may be slightly different, or it may not be installed exactly the same way. For this reason, you would likely want to perform some OQ regression testing or create a more robust PQ to ensure everything is working correctly.

    3. In the SaaS scenario, change control for core system functionality (and associated validation) remains with the hosting vendor. For the OTS scenario, change control to the core application will typically be the responsibility of the customer as patches/upgrades are released.

    4. If you have a highly configurable system (something more akin to a toolbox) vendor validation may begin to decrease in value as you will still need to robustly test your complex configuration.

    Thanks,
    Gregg

  4. Ruedi,

    We have been validating SaaS CRM applications for almost 3 years now (compounded with over 10 years of validating traditional CRM applications) so we feel we have established a good handle on the best practices for this type of application and model.

    In regard to Design Qualification (DQ), we typically leave this outside the vendor’s core validation package. Verification of the validity of DQ and other SDLC quality activities is usually handled via audits of the SaaS Application and Hosting vendor(s) by the customer. Because of the service nature of the SaaS model, we believe robust vendor audits are more important here than in traditional deployment scenarios.

    Although not mentioned in this blog post, the vendor validation package does include traceability of testing to baseline functional requirements for the core application.

    Thanks,
    Gregg

  5. I read the article and I cannot see anything there that is special for SaaS. The article simply describes the standard (GAMP) approach for a COTS product, i.e. leverage as much vendor documentation as possible to minimise the validation effort on the customer side.

    At the beginning of the article it says the article "...will look at the methodology for the validation of a SaaS CRM Application and compare it to the conventional methodology for the validation of a CRM Application...".

    However, I cannot see what is specific to CRM; the methodology is standard GAMP guidelines. I also cannot see a comparison with "conventional methodology".

    Also, the article keeps stating that the vendor will do this and the vendor will do that. This is a rather optimistic view of what a vendor will provide, and based on my experience, is often not realistic (at least at a cost point that is acceptable).

    If the article had described some of the real issues with compliance/validation and SaaS (e.g. remote access, data privacy and segregation, risks and escrow, open/closed system, lack of vendor documentation, etc...), there might be more value there.

  6. Ian, I would have been surprised if the validation per se would have been any different. What is different is who is responsible for what. For a SaaS application the onus of validation shifts considerably to the vendor who is providing the service. The vendor has to provide the IQ and baseline OQs and provide full documentation for these. Likewise the vendor will need to ensure the compliant status is maintained for the components managed/controlled by him/her.

    From a client perspective, the key requirement of the vendor audit is to verify that these deliverables exist and are to GxP standard, so that they pass muster during a regulatory audit.

    This shift in workload, in my opinion, is the main difference between validating SaaS versus an in-house deployed COTS application.

  7. The approach to validating SaaS and Cloud applications is just the same as for any other application. The challenge is in understanding, defining, and where necessary controlling the environment in which the application is running, and mitigating the additional associated risks.

    It's not really a validation issue, but a qualification issue and (as Donald Rumsfeld never said) making the 'known unknowns' and the 'unknown unknowns' known! The often promoted advantage of SaaS and Cloud Computing is that users don't need to worry about this or that - some external supplier takes care of it for you. However, validated applications need a controlled environment, and that means a known environment.

    And if we're plugging things (and why not?) people might like to register for a webcast "Qualifying the Cloud: Fact or Fiction?" at http://www.businessdecision-lifesciences.com/TPL_CODE/TPL_AGENDA/PAR_TPL_IDENTIFIANT/279/1584-agenda.htm

  8. We are doing validation of a SaaS application and it is nothing different. We are following GAMP 5 principles.
