I want to challenge a vendor on their anticipated testing effort for a proposed software delivery project, which seems way too low in my experience. The project is to migrate a dialer system to the cloud, including a fair amount of custom functionality. What percentage of total capacity do you normally allocate to testing for such a cloud migration (test setup, manual test execution, etc.)? I am looking for a figure like "20% of all development effort should be allocated to testing." Any ideas?
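To make the kind of figure I mean concrete, here is a minimal sketch; the numbers are purely illustrative and not from the vendor's proposal:

```python
# Illustrative only: turning a percentage rule of thumb into hours.
# Both figures below are hypothetical placeholders.
dev_effort_hours = 2000      # hypothetical vendor estimate for build work
test_share = 0.20            # the kind of ratio I'm asking about (20%)

test_effort_hours = dev_effort_hours * test_share
print(test_effort_hours)     # 400.0
```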
Head of Corporate Development (8 months ago)
At least a third. With most vendors I have seen, they will talk a big game and roll out tier 1 assets prior to contract closure; then, after signing, you get handed mediocre people doing the bare minimum. So yes, test, test, test, because they will do the bare minimum and walk away as fast as they can.

Director of IT in Transportation (8 months ago)
I have used a 20% time allocation for testing when planning large software projects, but that is meant as an average across all tasks. For a single workload, I'd be more inclined to think about the types of testing and the metrics I want, e.g. unit tests with 80% code coverage, 50 Selenium UI tests, etc., and have the vendor estimate those.

CTO in Media (8 months ago)
I wouldn't frame the requirement as a time split. You could have the vendor do a provable 50/50 dev/test split and still not get the outcomes you are seeking. My experience on both the vendor and the purchasing side of efforts like this has me preferring a milestone payment schedule.
If you get the results you're after, it shouldn't matter what percentage of effort each focus area received in the build process.
If you are after a successful implementation, ensure that the project, SOW, milestones, and, where possible, payments are tied to a functional product.
If you need robust tests to continue using after delivery, then make that a stated outcome.
Director of IT in Healthcare and Biotech (8 months ago)
If the vendor has a percentage in the contract that is too low, then by all means push for more. I've been on the receiving end of projects without enough testing, and it's a bad experience all around. However, I recommend the results-and-milestones approach above rather than a stated percentage of time. At the end of the day it's the results that matter, not how many hours a bean counter recorded against the contract. And if progress comes in below the bar, the vendor will have to put in more testing time than you initially expected.
CIO (8 months ago)
What we have used for acceptance is 80% success on the test cases, but for execution we require that all functionality be tested as part of the cloud migration. We don't set a percentage of resources or time to be allocated to testing; we just set the number of test cases and the time to perform the full test pass.
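A minimal sketch of how that 80% pass-rate gate might be computed; the function name and threshold are illustrative, not taken from any particular contract:

```python
# Illustrative acceptance gate: accept the migration only if at least
# 80% of executed test cases passed. All names here are hypothetical.

def acceptance_met(results, threshold=0.80):
    """results: iterable of booleans (True = test case passed)."""
    results = list(results)
    if not results:
        return False  # no executed test cases -> nothing to accept
    pass_rate = sum(results) / len(results)
    return pass_rate >= threshold

# Example: 42 of 50 migration test cases passed -> 84%, above the bar.
print(acceptance_met([True] * 42 + [False] * 8))  # True
```

Note the gate is on executed test cases, which is why the full set of functionality has to be covered by the test-case list first; otherwise 80% of a thin suite proves very little.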