Garbage in, garbage out. The quality of information coming out of a system can never be better than the quality of information that was put in. It may sound like one of Newton’s laws of physics, but it’s not. At the very least, it’s an always-relevant guiding principle for anyone who uses and analyzes data. Actionable analytical insights rely, first and foremost, on clean, structured data. This is, of course, much easier said than done in the reality of scientific research and business operations. Organizations often lack the time and resources needed to manually dig through and manipulate vast quantities of data.
Wearing the Hat of Data Analyst
With the rise of technology and the ever-growing amount of data being generated, concerns about accuracy and value have emerged. At Denver Startup Week 2019*, data quality and cleansing were at the forefront of many discussions, with the issue evidently affecting vastly different industries and job titles alike.
During one panel discussion, Keely Nolan, Director of Customer Success at GoSpotCheck, mentioned that regardless of our role within a company, we often wear the hat of Data Analyst and find ourselves cleaning up data before any analysis can happen. This manual “data wrangling” has many companies pondering automation and how solutions like Machine Learning can reduce manual tasks and the potential for human error.
“Dirty” Oilfield Data
One of the leading issues facing the oil and gas industry is messy, disorganized data. This was a primary reason Well Data Labs was established – to tackle some of these industry obstacles, specifically in the area of completions data.
Even as innovative technology and software emerge, they present further challenges to traditional industry standards. They also highlight potential compatibility issues as data is passed between different platforms or entities, each with its own file requirements and standards for a successful import. Sometimes the sheer volume of data alone raises basic questions about how to manage and standardize it in order to add value and extract insight. Software with the power to objectively consume and structure data from any source is a fundamental that Well Data Labs has reinforced; however, it is not without its roadblocks, and it is something we continuously tackle.
Single Source of Truth
At Denver Startup Week, Lucas Thelosen, VP of Professional Services at Looker, spoke about leadership’s responsibility to create a culture around the need for clean data and a central storage repository. At the end of the day, culture seems to be one of the key indicators of whether an organization is data-driven or not.
Well Data Labs thrives on being an agnostic solution for Operators, no matter what service company is being used. This follows the Central Data Model, which governs the crucial single source of truth for our customers. This model ensures that data is in a single location, is consistently structured, and maintains its accessibility. Everyone who needs access has access and can make informed decisions from the same dataset. If clean, coherent data is valued and made a priority early on, you will likely see this solid foundation positively trickle down into other areas of an organization.
*About Denver Startup Week
Built by the community, for the community, Denver Startup Week, which started in 2012, is a celebration of everything entrepreneurial in Denver and is the largest free event of its kind in the world. In 2018, the event brought 17,000 people together to celebrate Denver’s thriving entrepreneurial ecosystem and to showcase and build the city’s culture of innovation. The week-long event includes sessions, presentations, panels, workshops, happy hours, celebratory events, job fairs, and more. Well Data Labs employees, many of whom attend every year, have found tremendous value in discovering what other Denver startups are working on and in the discussions around best practices.