Q&A – Deep dive into Investigate DQ with Product Manager, Tom Seel.

Tom Seel is the technology product manager at Investigate DQ and has been involved since its inception. He has worked in numerous complex data migration and data remediation projects and is passionate about all things data. 

Tom believes data standards should be paramount from board level right through to everyday administrative units. He strives to see the level of data quality improve across financial services.  

In this Q&A, Tom discusses the data quality software tool Investigate DQ and how it grew out of data remediation work.

What does Investigate DQ do?  

(TS) Investigate DQ is a software platform that manages the quality of all your data from multiple sources and file types in one place. Data held on admin platforms, advice platforms, CRMs or any structured data file can be monitored simultaneously and reconciled against one another. This level of oversight keeps data in the best possible condition and fit for use across the board, and prevents costly remediation events and regulatory breaches.
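
As a purely illustrative aside (this is not Investigate DQ's own interface), a cross-source reconciliation of the kind Tom describes might look like the following minimal Python sketch; the file names, key and column names are hypothetical.

# Minimal cross-source reconciliation sketch (illustrative only).
import pandas as pd

# Load the same member records from two hypothetical sources.
admin = pd.read_csv("admin_platform_extract.csv")   # e.g. registry/admin platform
crm = pd.read_csv("crm_extract.csv")                 # e.g. CRM export

# Join on a shared key and compare a field that should agree in both systems.
merged = admin.merge(crm, on="member_id", suffixes=("_admin", "_crm"))
mismatches = merged[merged["date_of_birth_admin"] != merged["date_of_birth_crm"]]

print(f"{len(mismatches)} of {len(merged)} members have inconsistent dates of birth")
print(mismatches[["member_id", "date_of_birth_admin", "date_of_birth_crm"]].head())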

What do Investigate DQ users have to say about the product? 

(TS) Our clients are true leaders in data quality. They adopted best practice principles early and are reaping the benefits. We have been getting positive and constructive feedback from them, and we are seeing it in action: our clients are receiving accolades from regulatory bodies for the exceptional quality of their data, which is almost unheard of.

We always strive to have a close personal relationship with Investigate DQ users to better understand their needs from the tool. This results in valuable input that makes its way back into the software. Investigate DQ is fast becoming a highly tuned data quality tool, thanks to user feedback. 

Why was Investigate DQ designed? 

(TS) QMV found itself involved in numerous large-scale and complex remediation projects. Those experiences highlighted the importance of data quality and the need for a rigorous system to identify, track and measure data quality before it became a remediation task.  

We had a real need to fulfil, so we built to our own requirements using whatever technology we had available. We recognised the value of our designs and invested in creating an end-to-end product that we could deploy and re-use in future work. We had the foresight to make every aspect that is specific to financial services completely configurable and modular, so the product can be adapted to any industry.

How has the financial services market responded to lifting the standard in data quality? 

(TS) Over time we have seen a real shift in how the importance of data quality is recognised. We have seen the consequences first-hand at every level: millions spent by financial institutions to correct errors and fill system gaps, the impact on customer outcomes, satisfaction and brand reputation, and even regulatory breaches.

We have also noticed a shift in focus. Where data quality work was once seen as a costly and unnecessary exercise, it is now reported as an investment in preventing issues.

We really believe that a proactive approach can prevent many of these issues. We have seen this recognised through the adoption of regtech, and we hope the message continues to be promoted through the focus brought by the Royal Commission.

Which organisations currently make up your client base? 

(TS) We are proud to work with highly engaged and forward-thinking clients. Investigate DQ currently provides data quality assurance across several clients including the likes of Netwealth, IOOF, Aware Super, Grow, TelstraSuper, Qantas Super and Mercer. 

We think that mid-to-large enterprise organisations across different sectors such as wealth, banking, telcos, retail, and life insurance would benefit the most from using Investigate DQ. 

What is your approach to customer service?

(TS) We are open and transparent with clients about our product development. We regularly check in with them to see whether we can meet their needs better through the product, and we encourage their engagement. Our team likes going onsite and meeting the end users, which continues QMV's long tradition of putting clients first and maintaining a personal, relationship-based approach. Clients often comment on how flexible and approachable we are when needs arise.

What makes Investigate DQ better than competing solutions? 

(TS) Our product was designed by people with decades of knowledge and experience to meet an extremely focused need: tackling poor data quality in financial services. We knew very clearly what was needed and what would work, and we quickly recognised that there wasn't anything like it available in the marketplace.

Many of the data integrity tools we came across either required extensive and costly customisation to get started, which proved to be an additional overhead in an already complex environment, or were really just "add-ons" to other products that simply didn't meet needs effectively.

We kept this in mind, which resulted in a flexible and agile product that removes the overhead of a complex ETL (extract, transform and load) process to get up and running. How quickly Investigate DQ can be put to use is often only believed once it is witnessed.

What are the benefits to administrative personnel? 

(TS) When it comes to their data’s quality, admin staff have a tough job. They can find themselves up against barriers including complexity and lengthy delays because they depend on IT or other business units for assistance. With Investigate DQ, we sought to break through some of these barriers and empower admin staff to have full visibility and ownership of their data. 

We are also big on the idea of the “self-service” model. We have clients that have adopted Investigate DQ to the extent they design, implement, and support their own ongoing data quality rules with minimal guidance from us.  

The direct-to-source feature is particularly compelling because it allows for a faster installation given there is no overhead to understand another data set or configure a transformation layer before applying data quality checks.  

This approach promotes self-service because there is no new data model to learn before checking data quality. The checks simply rely on existing knowledge of the source system, so users don't need any specialist product knowledge to develop and maintain their own data quality rules.
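
To make the direct-to-source idea concrete, here is a minimal, purely illustrative Python sketch of a data quality rule expressed in the source system's own terms, with no intermediate transformation layer. It is not Investigate DQ's implementation, and the database, table and column names are hypothetical.

# Illustrative direct-to-source data quality check (not Investigate DQ's API).
import sqlite3

conn = sqlite3.connect("source_system.db")  # stand-in for an admin platform database

# The rule is plain SQL over the source schema, so whoever already knows the
# source system can write and maintain it without learning a new data model.
rule_name = "Members must have a valid, non-future date of birth"
failing_rows = conn.execute(
    """
    SELECT member_id, date_of_birth
    FROM members
    WHERE date_of_birth IS NULL
       OR date(date_of_birth) > date('now')
    """
).fetchall()

print(f"{rule_name}: {len(failing_rows)} exceptions")
conn.close()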

What does the product road map look like? 

(TS) I’m very excited to be looking after the product road map. We have some great ideas we want to develop that will continue our emphasis on managing data quality while also empowering our users through more functionality and innovative concepts. We will be rolling out new offerings to complement Investigate DQ as part of our ongoing commitment to improving the quality of data across the industry.

 

Get in touch with our expert team and transform your data quality today.
