Data Entry Environment (EDC)
The UBC proprietary Electronic Data Capture system offers industry-leading configurability and customization to meet the unique requirements of each study requiring de novo data capture.
Efficient capture and cleaning of data in a fully validated environment.
End-user perspective incorporated into design.
Bidirectional edit-check logic, dynamic branching, and highly configurable workflows are hallmark features that combine to create an optimized user experience, minimizing the burden on healthcare providers performing manual data entry.
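As an illustration only, a minimal sketch of how a bidirectional edit check might behave: the same rule re-evaluates whenever either linked field is edited, so a query fires regardless of entry order. The field names (`visit_start`, `visit_end`) and the function are hypothetical, not part of the Mosaic EDC.

```python
from datetime import date

# Hypothetical sketch: a bidirectional edit check links two fields so that
# editing either one re-evaluates the same rule for both.
def check_visit_dates(form: dict) -> list[str]:
    """Return a list of query messages for the visit-date rule."""
    issues = []
    start, end = form.get("visit_start"), form.get("visit_end")
    if start and end and end < start:
        # The same query fires whether the user last edited start or end.
        issues.append("visit_end must be on or after visit_start")
    return issues

form = {"visit_start": date(2024, 5, 10), "visit_end": date(2024, 5, 8)}
print(check_visit_dates(form))  # one query raised
```

In a real EDC the rule would typically be configured declaratively rather than coded, but the evaluation pattern is the same.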
Data Warehouse
Master Data / Data Cleaning
When cleaning or master data conditions arise, Mosaic automatically corrects the data and sends notice of the correction when possible, or engages the event engine to initiate a notice or workflow if human interaction is required.
Mosaic ships with standard cleaning and master data rules that work out of the box for a program, and these rules can be customized to meet a program's specific requirements.
Mosaic contains process engines that actively evaluate data to proactively locate errors and align the data to defined master standards.
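The auto-correct-or-escalate pattern described above can be sketched as follows. This is a hypothetical illustration, not Mosaic's implementation: the field name (`state`), the master-data map, and the `escalations` list standing in for the event engine are all assumptions.

```python
# Hypothetical sketch of a cleaning rule: apply an automatic correction when
# a safe fix exists, otherwise escalate for human review (in Mosaic's terms,
# engage the event engine).
def clean_state_code(record: dict, escalations: list) -> dict:
    fixes = {"n.y.": "NY", "calif": "CA"}  # illustrative master-data map
    raw = record.get("state", "").strip().lower()
    if raw in fixes:
        record["state"] = fixes[raw]          # automatic correction
        record["cleaning_note"] = "state normalized to master standard"
    elif len(record.get("state", "")) != 2:
        escalations.append(record)            # human interaction required
    return record

escalations = []
rec = clean_state_code({"id": 1, "state": "Calif"}, escalations)
print(rec["state"], len(escalations))  # corrected automatically, no escalation
```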
Event Processing
Mosaic contains an intelligent workflow automation engine that integrates with UBC core systems and offers a standard interface to exchange information with partners.
Event processing can actively monitor for specific data occurrences or event triggers. Once a trigger fires, event processing can engage a new workflow, pull additional data, build a data set, or fire a notification, to name a few examples.
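The trigger-to-actions model described above can be sketched with a small dispatcher. Everything here is a hypothetical illustration: the `EventEngine` class, the trigger name, and the action callables are assumptions, not the Mosaic API.

```python
# Hypothetical sketch: an event engine maps trigger names to a chain of
# actions (start a workflow, pull data, build a dataset, send a notice).
class EventEngine:
    def __init__(self):
        self.handlers = {}
        self.log = []

    def on(self, trigger, *actions):
        """Register one or more actions to run when a trigger fires."""
        self.handlers[trigger] = actions

    def fire(self, trigger, payload):
        """Run each registered action in order, recording its result."""
        for action in self.handlers.get(trigger, ()):
            self.log.append(action(payload))

engine = EventEngine()
engine.on("new_adverse_event",
          lambda p: f"workflow started for case {p['case']}",
          lambda p: f"notification sent for case {p['case']}")
engine.fire("new_adverse_event", {"case": 42})
print(engine.log)
```

The same pattern extends naturally to a standard partner interface: external systems would post triggers rather than calling `fire` directly.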
Standard Model
Comprehensive, flexible models designed to meet the needs of specific areas, based on standard industry models.
Standard models are configured and extended to meet the specific needs of a program to ensure the data collected and provided aligns with the defined parameters of the program.
In addition to providing a standard model to define data points, measurement scales, and data formats, Mosaic also implements a set of standard policies to define and control data access and usage standards.
Eliminates the burden on providers to meet model standards by accepting data in any format and field size, rather than requiring suppliers to meet our standards. Mosaic handles the mapping and transforms needed to align data with the program's models.
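The map-and-transform step described above can be sketched as a per-supplier mapping spec. The source field names (`pt_wt_lbs`, `DOB`), target names, and unit conversion are illustrative assumptions, not the program's actual model.

```python
# Hypothetical sketch: supplier files arrive in arbitrary shapes; a mapping
# spec aligns each feed to the program's standard model field names and units.
MAPPING = {  # illustrative spec for one supplier
    "pt_wt_lbs": ("weight_kg", lambda v: round(float(v) * 0.453592, 1)),
    "DOB": ("birth_date", str),
}

def to_standard_model(raw: dict) -> dict:
    """Rename fields and convert units per the mapping spec."""
    out = {}
    for src, (target, transform) in MAPPING.items():
        if src in raw:
            out[target] = transform(raw[src])
    return out

print(to_standard_model({"pt_wt_lbs": "150", "DOB": "1980-01-01"}))
```

Keeping the spec as data (rather than code per feed) is what lets a new supplier be onboarded by configuration alone.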
Data Processing
If a request is received to remove a person and their transactional data from a program, Mosaic can perform data removal to the exact standard defined by the request, including removal of data from the original source files in archive.
Data processes and structures designed to provide full traceability and history retention for every data point in the warehouse.
Archive files are stored in a manner such that, if reprocessing of data is required, it can be performed in the exact same order as the original processing.
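One way ordered replay can work is for each archived file to carry its receipt sequence, so reprocessing sorts on that sequence before replaying. A hypothetical sketch (the sequence field and file names are assumptions):

```python
# Hypothetical sketch: each archived file records its receipt sequence, so a
# replay processes files in exactly the original order regardless of how the
# archive entries happen to be stored.
archive = [
    {"seq": 3, "file": "labs_2024_03.csv"},
    {"seq": 1, "file": "enroll_2024_01.csv"},
    {"seq": 2, "file": "labs_2024_02.csv"},
]

def replay_order(archive: list[dict]) -> list[str]:
    """Return file names in original receipt order for reprocessing."""
    return [entry["file"] for entry in sorted(archive, key=lambda e: e["seq"])]

print(replay_order(archive))
```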
All entities and transactions are assigned a unique master token ID to ensure linkage capabilities across programs as well as to real-world data.
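A common pattern for such linkage is a deterministic token derived from normalized identity attributes; the sketch below illustrates the idea only and is not Mosaic's tokenization scheme (real systems typically use keyed hashing through a dedicated tokenization service).

```python
import hashlib

# Hypothetical sketch: a deterministic token derived from normalized identity
# attributes lets the same entity be linked across programs and real-world
# data sources without exposing the raw identifiers themselves.
def master_token(first: str, last: str, birth_date: str) -> str:
    normalized = "|".join(s.strip().lower() for s in (first, last, birth_date))
    return hashlib.sha256(normalized.encode()).hexdigest()[:16]

a = master_token("Ada", "Lovelace", "1815-12-10")
b = master_token(" ADA ", "lovelace", "1815-12-10")
print(a == b)  # normalization makes the same entity yield the same token
```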
Mosaic provides the ability to receive and deliver data in any format, from any provider, at any time, and at any size. These exchanges can be batched on a defined schedule or performed in real time.
Reporting & Analytics
Intuitive Interface
User navigation designed to get to insights quickly with best-in-class visualizations.
Export Capabilities
Multiple data and chart export options, including PDF, Excel, and PowerPoint.
Data Refresh
Data is refreshed as often as hourly by default, with a focus on speed and performance.
User Security
Named-user licensing and credentials assigned to role-based user groups.
Email Subscriptions
Create custom email subscriptions to deliver data to your inbox or share it with other licensed users.
Fast Rendering
Data modeling and query processing optimized to render visualizations and filter data quickly.