
IV.B.3. PCP Phase III (Monitoring): Final prototype development, including procurer-specific adaptations and integrations, further pre-clinical validation, and a preliminary economic impact evaluation (if applicable)

Estimated Execution Time

6–18 months (aligned with the contractual duration of PCP Phase III and the Monitoring Team calendar defined in step IV.A)

Objective

  • Validate the performance, usability, and technology readiness level (TRL) of each contractor’s prototype under real-life operational conditions simulating actual use in the procurers’ environments.
  • Assess reliability, scalability, integration capacity, and end-user acceptability.
  • Evaluate the roadmap for certification, industrialisation, and commercialisation to ensure that validated solutions can reach the market after the PCP.
  • Generate robust evidence to support final lessons learnt and inform innovation policy, follow-up actions, and potential scale-up.

Who is Involved

Hospital / Procurer Side (Internal)

  • Monitoring Team: oversees pilot deployment, gathers technical and user data, and consolidates evaluation results.
  • Procurers / Buyers Group: provide testing environments and authorisations, and ensure pilots align with procurement objectives.
  • Healthcare Professionals / Operational Staff: end-users such as clinicians, technicians, or administrative personnel testing the solutions in real contexts.

Innovators / Providers (Internal)

  • Contractors: deploy, support, and troubleshoot their solutions during the pilot according to the established protocol.

Ecosystem & Support Actors

  • Legal / Ethical Advisors: ensure compliance with GDPR, informed consent, and other ethical frameworks when required.
  • Knowledge Partners: support data collection, impact evaluation, and analysis of pilot outcomes.

Activities / Tasks

IV.B.3.1 Establish the pilot validation and monitoring set-up

  • Confirm the pilot protocol, validation indicators, and performance thresholds in line with tender requirements and operational conditions.
  • Define roles, responsibilities, reporting formats, and documentation standards for Phase III monitoring to ensure consistency and traceability across sites.
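The validation indicators and performance thresholds agreed at this stage can be captured in a simple machine-readable form so that later monitoring is applied consistently across sites. A minimal sketch in Python — the indicator names and threshold values below are illustrative assumptions, not figures from any actual tender:

```python
# Illustrative validation indicators with pass thresholds (example values only).
# Each entry: indicator -> (direction, limit), where "min" means the measured
# value must be at least the limit, and "max" means at most the limit.
THRESHOLDS = {
    "uptime_pct": ("min", 98.0),         # solution availability during the pilot
    "error_rate_pct": ("max", 2.0),      # share of failed operations
    "task_completion_s": ("max", 120.0), # median time to complete a key task
}

def check_indicators(measured: dict) -> dict:
    """Compare measured values against the agreed thresholds.

    Returns a pass/fail flag per indicator, giving every site the same
    traceable basis for reporting.
    """
    results = {}
    for name, (direction, limit) in THRESHOLDS.items():
        value = measured[name]
        results[name] = value >= limit if direction == "min" else value <= limit
    return results
```

Keeping the thresholds in one shared structure, rather than re-stating them in each site's report, is one way to ensure the consistency and traceability this step calls for.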

IV.B.3.2 Execute pilot deployment and validation activities

  • Coordinate the deployment, integration, and day-to-day operation of the solutions in real-life settings, ensuring infrastructure readiness and compliance conditions are met.
  • Support users and track operational performance, facilitating on-site technical assistance and incident management by contractors, and monitoring objective indicators such as uptime, reliability, task completion time, and error rates.
  • Gather and analyse user feedback, collecting structured input through surveys, usability scales, or interviews, and evaluating user satisfaction, acceptability, and alignment with workflow needs.
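The objective indicators and structured usability input mentioned above can be computed straightforwardly. As a hedged sketch, the example below assumes uptime is tracked as operational seconds over total pilot seconds, and that the usability survey follows the standard 10-item System Usability Scale (SUS); any other scale agreed in the protocol would need its own scoring rule:

```python
def uptime_pct(up_seconds: float, total_seconds: float) -> float:
    """Share of pilot time the solution was operational, as a percentage."""
    return 100.0 * up_seconds / total_seconds

def sus_score(responses: list) -> float:
    """Score a standard System Usability Scale questionnaire.

    Ten items rated 1-5; odd items contribute (rating - 1), even items
    contribute (5 - rating); the sum is scaled by 2.5 to a 0-100 score.
    """
    assert len(responses) == 10, "SUS requires exactly 10 item ratings"
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5
```

For example, a respondent answering 3 (neutral) on every item scores 50, the conventional midpoint of the scale.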

IV.B.3.3 Consolidate results and finalise Phase III monitoring outputs

  • Evaluate pilot outcomes and scalability potential by conducting final reviews per site and per contractor, assessing reliability, interoperability, integration capacity, and overall impact against the predefined criteria.
  • Examine each contractor’s roadmap for certification, industrialisation, and commercial readiness, identifying gaps, risks, and realistic next steps towards market deployment after the PCP.
  • Consolidate results and lessons learnt by organising joint debrief sessions to validate findings and by feeding evidence into the agreed monitoring outputs, including comparative analysis, end user feedback summary, and lessons learnt and policy implications.

Tips / Common Pitfalls

Involve end-users from the start of pilot planning to ensure their engagement and the relevance of testing scenarios.

Engage and train end-users before pilot deployment, helping them focus on the core functionalities relevant for development.

Assess the roadmap for certification and commercialisation. Identify gaps in readiness and opportunities for scale-up or follow-up support.

Negotiate with contractors extended use of the solution beyond the project lifetime. This enables broader validation, generates stronger evidence for buyers, and supports contractors’ certification progress.

Don’t exclude real end-users from validation: doing so limits the relevance and reliability of feedback. However, avoid engaging them before basic reliability has been assessed.

Don’t impose unnecessary technical or integration requirements. Focus pilots on demonstrating essential functionality and interoperability.

Don’t underestimate preparation and approval timelines. Plan buffer time for delays in ethical or medical device committee reviews, and for tasks such as recruiting users, configuring and adapting solutions, setting up environments, and obtaining approvals from IT, legal, or ethics boards.

Don’t ignore interoperability and real operational constraints. Such oversights can compromise the pilot’s credibility, usability, and future scalability.

Outcome / Deliverables

  • Performance Evaluation Reports completed for each solution, presenting key indicators, incidents, and validated user feedback.
  • Comparative Analysis Report consolidating results across sites and identifying solutions ready for procurement or further development.
  • End-user Feedback Summary including satisfaction metrics, usability insights, and improvement suggestions.
  • Lessons Learnt and Policy Implications Document summarising cross-site evidence and recommendations for replication or scale-up.
  • Commercialisation Readiness Overview outlining certification progress and the next steps towards market deployment.
