Using big data to improve farming, one cow at a time

Candice So | Published: 11/28/2013

When GrowSafe Systems Ltd. first began studying cattle in 1990, it all started with a team of engineers and computer scientists, working with 386 processors and trying to understand the flood of data coming their way.

But it’s only been in the past six years or so that the Airdrie, Alta.-based company has moved out of the research phase, with the commercial applications of its data really beginning to take off, says Alison Sunstrum, the company’s CEO.

“We’ve actually been working on this for a long time. We laugh about it being an overnight success because we started out in 1990 looking at animal behaviour, and what we did then compared to what we do now is very different,” she says.

“The reason we are a success now is because advances in computer processing power, advances in Internet accessibility, advances in routers, advances in everything – it’s a convergence … We’ve now been able to do quite a few different things.”

Now, more than 20 years later, GrowSafe has won an award for how it collects and analyzes big data – a technology giving farmers the ability to make better decisions about their cattle.

GrowSafe netted a 2013 Ingenious Award from the Information Technology Association of Canada, named the winner for best technological innovation by a small to mid-sized business in the private sector.

What gave it an edge is its GrowSafe Beef technology, a proprietary way of collecting data in the feedlot and providing solutions to cattle raisers, Sunstrum says. About half of its customers are researchers in university agricultural departments, while the other half are commercial farmers. Within the farming group there are some large clients, but its smallest customer is a farm with about 200 head of cattle.

GrowSafe’s solution works by strategically placing biometric sensors in the water troughs and feeding areas of the feedlot. These sensors constantly collect data on the animals’ weight, movements, drinking behaviours, feeding behaviours, temperatures, and much else.

Every night, starting at midnight, GrowSafe’s computers run the data through 30 different algorithms that look at areas like animal health and industry economics. The results then get pulled into a report or data visualization so a customer can understand them.

However, where GrowSafe really comes in handy is in spotting outliers in the data, raising flags when individual animals are sick or need to be looked at more carefully.

Clues about an animal’s well-being might stem from how often it drinks water, how restless its movements are, or how often it eats – all of which is captured by the biometric sensors. The animal’s radio-frequency identification tag then helps GrowSafe employees pinpoint the exact animal, and the system can even automatically spray-paint the individual animal bright green so a farmer can take it out of the pen and investigate.

(Image: GrowSafe). Sensors have identified a sick cow and have spray-painted it green so a farm employee can remove it from its fellows.
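
As a rough sketch of how such a nightly outlier check might work (the field names, thresholds, and readings below are hypothetical, not GrowSafe’s actual algorithms), each animal’s latest reading can be compared against its own recent baseline:

    # Hypothetical nightly check: flag animals whose water intake has
    # dropped far below their own recent baseline.
    from statistics import mean, stdev

    def flag_outliers(history, today, z_threshold=-2.0):
        """history: {animal_id: recent daily intakes (L)}; today: {animal_id: litres}."""
        flagged = []
        for animal_id, intakes in history.items():
            if len(intakes) < 2:
                continue                      # need a baseline to compare against
            mu, sigma = mean(intakes), stdev(intakes)
            if sigma == 0:
                continue
            z = (today[animal_id] - mu) / sigma
            if z < z_threshold:               # drinking far less than usual
                flagged.append(animal_id)
        return flagged

    history = {"RFID-0041": [38, 41, 40, 39], "RFID-0107": [35, 36, 34, 37]}
    today = {"RFID-0041": 39, "RFID-0107": 12}
    print(flag_outliers(history, today))      # ['RFID-0107']

A production system would run many such checks in parallel across the feeding, movement, and temperature streams, in the spirit of the 30 nightly algorithms described above.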

Still, agribusiness isn’t an industry renowned for embracing technology. Many farmers take some convincing that an animal is sick, even when the data indicates something is off, Sunstrum says.

For example, one of GrowSafe’s customers appeared to have a sick animal, so GrowSafe flagged it, sending multiple messages over the course of a few days. When the customer finally answered, he said the animal didn’t look sick. The customer even sent photos of the cow, showing there didn’t appear to be anything wrong with it. Three days later, the animal died.

There’s a good chance many of GrowSafe’s commercial customers have a similar story, simply because many farmers have been breeding cattle for years and are relying on past experience. It can create something of a clash, considering GrowSafe’s employees are engineers and computer scientists first, and animal experts second. So right now, most of the company’s commercial customers are farmers who are early adopters.

“The issue is that, the majority of people in the agriculture industry, we really are a resistant population. We are resistant to technology,” Sunstrum says. “But if you have really good data, and continuous data, you can sometimes dispel conventional myths. We think we know a lot, but when you start collecting big data, you really start to learn what you don’t know.”

“What we sometimes say to folks in our industry is that we’ve basically taken cowboy logic – so all those things you think you know, so you know how to look at an animal, you know how to do certain things – we’ve taken your logic, or your knowledge, and we’ve built that into our software.”

However, none of this would have been possible if computing hadn’t developed the way it has, Sunstrum says. By switching to an Intel Core i7 processor, GrowSafe has been able to cut its processing time to three hours, down from six.

Seeing smaller businesses take this technology forward is very rewarding, says Elaine Mah, director for Canada at Intel Corp. While big data has been on the scene for a while now, it’s traditionally been the province of enterprise organizations. Yet GrowSafe, with its 20 employees, is one of a growing number of small businesses that are taking advantage of the data in front of them, she says.

Still, she feels there’s some way to go before big data hits its stride.

“I would say we’re still very much in the learning and growing stage of big data. People are just starting to understand what is the data they hold, and how do I manipulate it to get that intelligence out of it,” Mah says. “Then there will be the interesting process of doing data mashups, and looking at the data in unexpected ways and dimensions that help to drive the next level of insight for an organization.”

For GrowSafe, it took a little more than two decades to be able to use big data to help farmers in practical ways, Sunstrum says.

“We found a place where our thoughts and our imaginations, sort of our vision, suddenly became much more possible,” she says.

“Where we find ourselves today in this technology, we actually can acquire the amount of data we do, analyze the data in real time, and provide a solution back. It’s been a long road.”

However, the company still hopes to be able to do more, aiming to build out its services so it can actually treat sick animals directly in the pen, instead of just flagging the ones that need attention.

And with big data moving forward as quickly as it is today, here’s hoping that this time around, it will take less than another 20 years.

6 big risks of big data for Boards to consider

Catherine Aczel Boivie | Published: 08/08/2014

Not a day goes by without big data being mentioned in the news as a differentiator or an issue for organizations. But what does that mean for the very top level of company management?

First, a simple description of big data: data that is too big to deal with in an Excel spreadsheet. Big data is a strategically important corporate asset, and as such governing Boards need to have oversight of that asset. This oversight should not involve getting mired in the details, in keeping with the well-known Board mantra “Noses in, fingers out”: Boards oversee management but do not do management’s job. The following provides additional background on big data, discusses some of the risks Boards need to consider in that area, and sets out basic questions that Board members should ask to ensure that the risks associated with big data are properly managed.

In the last few years, the term “big data” has become a hot topic in boardrooms and companies around the world. It is not one of those technology terms that require pages of explanation. It is all the data an organization collects: financials, customer specifics, orders, invoicing, online transactions, to name just a few. This includes the data that organizations store in databases (commonly referred to as structured) as well as all other data the organization collects that is not in databases (unstructured). Big data is, in fact, large amounts of data brought together and analyzed to provide input into business decision making.

In the last few years, big data has been credited with a role in the growth of Google’s search engine and the success of IBM’s Jeopardy champion, Watson. Big data analysis is being used as a differentiator by Amazon with features like “other people who ordered this item also…” It has also allowed social networks to analyze chats for product features that can be used to solicit advertising revenue.

One of the main functions of Boards is to manage the organization’s risks.

Here are some of the top risks of big data that Boards need to oversee:

1. A business strategy must be in place to manage the organization’s data.

The strategy should describe the framework for data collection and analysis, such as whether it should be focused on products, customers, or both. As an example, it could specify that it will provide a full picture of each customer’s activities regardless of the products and services being utilized. At one of the Boards I served on, the strategy was to create a large customer database organized by customer number. This allowed data on each customer to be collected and stored from the various databases, which in turn could be readily accessed for analysis and then reported on in high-level dashboard reports for executives and the Board and in more detailed reports for senior management. A cautionary note: responsibility for overseeing this strategically important asset should be assigned to a senior executive, as opposed to the IT division. This is a strategic business initiative that will affect every department in the organization, not an IT project of housing and maintaining the data.
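
As a toy illustration of that customer-number approach (the sources and fields below are hypothetical), records scattered across separate line-of-business databases can be folded into a single per-customer view:

    # Hypothetical consolidation: merge records from separate systems
    # under one customer number so reports can draw on a single view.
    from collections import defaultdict

    accounts = [{"cust_id": 101, "product": "chequing", "balance": 2300}]
    loans    = [{"cust_id": 101, "product": "mortgage", "balance": 410000}]
    support  = [{"cust_id": 101, "tickets_open": 1}]

    customer_view = defaultdict(dict)
    for source in (accounts, loans, support):
        for record in source:
            cust = record["cust_id"]
            fields = {k: v for k, v in record.items() if k != "cust_id"}
            if "product" in fields:           # products accumulate in a list
                customer_view[cust].setdefault("products", []).append(fields)
            else:                             # other attributes merge directly
                customer_view[cust].update(fields)

    print(customer_view[101])
    # {'products': [{'product': 'chequing', 'balance': 2300},
    #               {'product': 'mortgage', 'balance': 410000}], 'tickets_open': 1}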

2. Proper data security and data protection must be in place.

Big data is sometimes linked to Big Brother in terms of invasion of privacy. Hence, policies and procedures need to be established to ensure its proper use, specifying such matters as what data will be stored and where, who will have access to what information, and how long information will be retained.

3. Data has to be organized from a corporate view.

It should be stored in a strategic way that avoids the “islands of data” or data-silo syndrome, where one island has all the financial information, another the customer data, while a third holds product information, and so on. Often each application has its own database, resulting in organizations having a myriad of databases. There are significant costs not only to maintain these databases but also to build and continuously maintain the bridges that connect them. The more bridges that are needed, the higher the related costs, and the number grows quickly: n databases can need as many as n(n-1)/2 point-to-point bridges, so five databases (a very low number) could need 10 bridges, and 16 databases could need 120. There is also the matter of diverting resources from data analysis to maintenance, which inevitably affects the productivity of the organization.
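
A quick sanity check of that growth, using the pairwise formula above:

    # n databases can need up to n*(n-1)/2 point-to-point bridges.
    from math import comb

    for n in (5, 10, 16):
        print(n, "databases ->", comb(n, 2), "bridges")
    # 5 databases -> 10 bridges
    # 10 databases -> 45 bridges
    # 16 databases -> 120 bridges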

4. The right analytics tools need to be in place to analyze the data.

Analyzing a sea of information to produce accurate and timely reports that lead to actionable business projects is a key differentiator for organizations. Analytics tools are expensive and have a steep learning curve for everyone involved. In addition, there is a major shortage of experts who can analyze the data. In effect they have to be “bilingual” since they need to know the potential uses of those analytics tools as well as understand the organization’s business intimately.

5. Big Data comes at a big cost.

Storing, organizing and analyzing the data will cost even more if it is done haphazardly, without a plan and oversight by senior management and the Board. Going without such a plan is like adding a feature such as an extra elevator to a high rise: if it is not incorporated at the building stage, the cost of adding it later increases dramatically. When calculating the total cost of big data, key factors such as analytics software licences, additional staff, new hardware, and the cost of collecting and managing the data all need to be included.

6. Big data analysis will also require changes in the corporate business processes.

For example, in the HR department the main sources of information for hiring staff used to be résumés, interviews, and reference checks. With big data there is a lot more information obtainable about an applicant from online sources such as Facebook and LinkedIn, making more information available for decision making in the hiring process. Other examples of big data opportunities: fast-food drive-ins switching their digital menus to quick-to-prepare items when there is a long lineup, or airlines adjusting seat prices based on demand and inventory.

How does the Board ensure that big data is managed well? Here are five questions that the Board should start asking:

  1. What are the goals of the organization’s Big Data strategy?
  2. Who is in charge of the organization’s Big Data strategy and how is s/he going to ensure that it doesn’t get bogged down with analysis paralysis?
  3. How will management identify the information that drives value for the organization?
  4. Who will have the responsibility to determine what is relevant?
  5. Who will ensure that proper security and data protection regulations and procedures are followed?

If Boards do not deal with big data, the organization will miss opportunities, trends and directions. Processes will remain the same, and decisions will be made without considering strategically critical data that could have been available if only someone had the foresight to include it in the organization’s big data strategy. With the availability of big data, relying on gut feel when making decisions is no longer enough!

While Boards cannot be involved in the day-to-day activities of managing big data, they do need to ensure that there is a clear vision and collaboration across all business areas to make the most of the organization’s big data asset.

 

OCAD University President Sara Diamond: big data stories must be told with multi-sensory approach

Brian Jackson | Published: 08/07/2014

As part of the Centre for Information Visualization and Data-Driven Design (CIVDDD), Dr. Sara Diamond is helping to bring together strong science capacity with creative storytelling. She believes representing data sets in visual and sensory ways will be an important part of how we understand our world in the 21st century. For example, OCAD University is helping CBC Newsworld make its own ‘holodeck’ to explore its archive.

Can chief data officers become the heroes of big data?

Chris Thierry | Published: 05/28/2014

 

While bigger businesses in specific industries have employed Chief Data Officers (CDOs) for a while, they remain relatively rare birds.

Not surprisingly the rise of CDOs and the use of the term “big data” in the popular vernacular have coincided.

The rise of big data

The term “big data” first appeared in 1998, when it was coined by John Mashey.

Mashey used the term to refer to the vast amounts of data that were becoming increasingly complex and unmanageable to process, even back then.

Enter the new kid

The first CDO was appointed in 2003 at Capital One, the US financial institution. As of this year, Gartner estimates that there are currently more than 100 CDOs, mostly working in financial services and government. This number is liable to increase, says recruiting firm Russell Reynolds, which claims that by next year half of Fortune 500 companies will have a CDO on staff. Interestingly, the Federal Communications Commission (FCC) recently announced it is creating a CDO for every department: a total of 10 CDOs for 10 bureaus and offices.

So just what is it CDOs do?

The role of the CDO defined

The job of the CDO is not only to focus on big data for better data mining and analysis; s/he must also find ways to control the volume of data and make it “accurate, actionable and accessible,” to quote TD Ameritrade’s CDO, Derek Strauss.

In a more micro sense, here’s a non-exhaustive list of the principal functions of the CDO, from the Executive Report, The Role of Chief Data Officer in the 21st Century by the Cutter Consortium:

  • Data governance

The CDO must organize and execute everything – policies, procedures, structures, roles, and responsibilities – that is required for proper management of data assets. This includes establishing things such as decision rights, rules of engagement, and accountability over data.

  • Standards

The CDO must establish uniform standards for data naming and acronyms, data modeling, data defect thresholds, data quality improvement, security, and privacy.

  • Business Intelligence (BI)

The CDO is responsible for BI. This means CDOs oversee things like decision-support applications, which allow managers across the organization to make informed decisions.

  • Data in the Cloud

The CDO has to carefully consider the use of cloud storage. Generally speaking, huge volumes of data can make cloud storage a good option, but benefits must be weighed against the cost and the risk of exposing data.

  • Security and Privacy

The CDO has an important role to play with the security department to determine the level of data security appropriate to the company.

With these new guidelines, the CDO is perceived as being data savvy and capable of changing and improving the big data culture of a company. Should the CDO be a permanent executive or a temporary consultant who trains employees and guides and remodels a company’s data approach only for the duration of his contract?

What about the CIO?

Among CIOs, there is divided opinion on the issue of big data. Some do not see managing it as a great challenge and others think it’s a complete nightmare. CIOs seem to understand the importance of data analytics though – indeed 70% of those surveyed in a report by KPMG said it was a very important business driver.

Whatever their view of big data, the fact is a CDO can’t replace a CIO. Properly managed, the relationship should not boil down to a territorial battle, because CDOs free CIOs to focus on infrastructure and on creating value from the use of technology.

The Road Ahead

It’s too soon to predict the long-term future of these roles. For now, it can be said that the CDO title has gained some acceptance in the last 10 years and that CIOs are still critical for businesses. What we do know is that the world’s data now doubles every 2 years, and that the digital universe will grow by a factor of 300 by 2020.

Smart businesses are thinking about how to manage this explosion.

Big data a ‘Holy Grail’ of opportunity for startups in cloud era

Brian Jackson | Published: 11/05/2012

It’s the inaugural Extreme Startups demo day at Berkeley Church in downtown Toronto, and it’s Granify CEO Jeff Lawrence’s turn to evangelize his startup to the congregation – and much like a Sunday sermon, the promises he’s giving are hard to believe.

“We’re doubling the revenue of online stores without them actually having to do any more work,” he says about Granify, his e-commerce startup. “We figure out what will influence each shopper. Then we take action to convert those people and get them to buy.”

There’s a secret to converting Web site window shoppers into buyers as successfully as a Jesuit converts pagans to Catholicism. E-commerce wonders like Amazon.com know what it is – an algorithm that can evaluate what a person is interested in and dynamically respond by showing them products they are more likely to want. Because Amazon has a massive IT infrastructure behind its e-commerce site – one that has also become an IT services business of its own – it can collect “big data” on the millions of transactions being completed.
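
The details of Amazon’s and Granify’s systems are proprietary, but a minimal sketch of the co-occurrence idea behind “people who ordered this also ordered” (with made-up orders) looks like this:

    # Count how often items appear together in past orders, then
    # recommend the most frequent companions of a given item.
    from collections import Counter
    from itertools import combinations

    orders = [
        {"kettle", "mug", "tea"},
        {"mug", "tea"},
        {"kettle", "toaster"},
    ]

    co_counts = {}
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_counts.setdefault(a, Counter())[b] += 1
            co_counts.setdefault(b, Counter())[a] += 1

    def recommend(item, k=2):
        """Items most often bought alongside `item`."""
        return [name for name, _ in co_counts.get(item, Counter()).most_common(k)]

    print(recommend("mug"))  # ['tea', 'kettle']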

Jeff Lawrence, CEO of Granify, presents at Extreme Startups demo day.

This allows Amazon to convert 18 per cent of its Web site visitors into buyers, Lawrence tells the crowd. Most Web stores convert just two per cent. So Granify will help those stores level the playing field by providing the big data crunching capability that made Amazon successful. Its service provides the IT infrastructure of a larger enterprise operation to any modest Mom & Pop Web shop.

But the “big data” trend that Lawrence refers to doesn’t stop there. It’s an all-pervasive trend in the IT market, with the buzzword attaining just as much clout as “cloud computing.” More startups are seeing the opportunity of building a business on the principle of crunching a huge set of previously unmanageable data and turning it into digestible and actionable nuggets.

It’s a trend that’s moving horizontally across sectors and attracting attention from entrepreneurs and venture capitalists alike. In 2012, $28 billion in IT spending was driven by big data demands, Gartner Research says. It estimates that big data will directly or indirectly drive $120 billion in IT spending next year. Though it’s only coming into the limelight now, the big data movement traces its roots back a decade.

Elephants never forget

That’s when the emergence of open source systems like Hadoop had some in the IT industry feeling like revolution was in the air, recalls Jason Rose, vice-president of solution marketing for business intelligence at SAP AG. While the notion held by some enthusiastic advocates that open source software would eventually disrupt the enterprise software market hasn’t fully been borne out, big data did present a common problem the open source community could sink its teeth into.

“Most enterprises have seen their traditional information stream explode,” Rose says. “There’s suddenly a variety of information coming at you from different angles.”

Traditional databases can’t make sense of the unstructured and transient data that larger firms want to understand. In other words, you’re not going to create Microsoft Access records for every customer who visits your Web site in order to track what path they enter on, what pages they visit, and whether they come back later in the month. You need a framework like Hadoop (named after the creator’s son’s toy elephant, because elephants never forget).

Hadoop and some alternatives offer a batch processing method for unstructured data. Derived from Google’s own file system architecture, it enables an application to call on whole clusters of computers to scour petabytes of data. The Hadoop developer community has grown to include large Web brands that build upon its base model to support their massive cloud-based services. Take Facebook’s Hive infrastructure, for one; it’s used to provide data analysis and queries.
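
To make the batch model concrete, here is a minimal sketch in the style of Hadoop Streaming, which lets any executable act as mapper or reducer by reading stdin and writing stdout (the comma-separated log format is a hypothetical stand-in):

    #!/usr/bin/env python
    # mapper.py: emit "visitor_id<TAB>1" for every page-view log line.
    import sys

    for line in sys.stdin:
        fields = line.strip().split(",")   # hypothetical: visitor_id,page,timestamp
        if len(fields) >= 2:
            print("%s\t1" % fields[0])

    #!/usr/bin/env python
    # reducer.py: input arrives sorted by key, so counts can be streamed.
    import sys

    current, count = None, 0
    for line in sys.stdin:
        key, value = line.strip().split("\t")
        if key != current:
            if current is not None:
                print("%s\t%d" % (current, count))
            current, count = key, 0
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

Submitted through the hadoop-streaming jar, the same two scripts run unchanged across a whole cluster, which is what makes the model attractive for petabyte-scale logs.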

Hive mentality leads to user insights

Hive is being used by Toronto- and San Francisco-based Kontagent to offer in-depth Web analytics to developers, marketers, and product managers. Before big data was a hot term, Kontagent was playing in the market, says co-founder and CEO Jeff Tseng.

“We were the first company to put that online, in the cloud, so people could access it without having to use their own infrastructure,” he says. “All of a sudden you have access to what Google had years ago.”

Kontagent can track Web users down to the individual level and glean their social interactions and behavior across multiple channels. It can answer questions for owners of Web sites, mobile apps, and social apps such as whether users are sticking around after they use an app for the first time. The service is about more than just brute force analysis of multiple data sets. About 10 to 15 per cent of Kontagent’s staff are PhD data scientists developing algorithms to pull the intelligence out of the maelstrom.

As the need to track hundreds of thousands or millions of visitors has grown across the Web, software solutions like Kontagent’s have grown to meet the demand. Its kSuite platform tracks more than 150 million active users of more than 1,000 social applications. Its customers include Konami, Adult Swim, BBC, Electronic Arts, Warner Brothers, and PopCap.

Tseng got into the big data game early with Kontagent, but with venture capitalist interest high and the open-source, cloud-based tools readily available, many other big data startups are spilling onto the scene. They’re addressing all different types of industry verticals – from finance to construction to e-commerce.

Big data for babies

Allison Gibbins founded Simplify Analytics Inc., which launched at Startup Weekend in Ottawa last June, to get more honest – and therefore accurate – customer research data. Drawing on inspiration from her own pregnancy group, Gibbins has introduced BabySimplify as her company’s first Web site. It helps expectant parents figure out exactly what they need, and don’t need, for their baby.

The model is almost the opposite of Amazon’s recommendations algorithm, which is always persuading visitors to buy more.

“We thought we could switch the model [so] it actually benefits the consumer,” Gibbins says. “You’re not using it to get them to buy more, you’re using it to buy smart.”

The business play for Gibbins is to incent expectant mothers to answer research questions honestly. She’s looking at segmenting that demographic into behavioural categories. Baby monitor companies might just benefit from knowing how stressed out certain mothers get at the sound of a newborn’s cry, for instance. It’s about honing products and marketing messages for the right audience.
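
A toy sketch of that kind of segmentation (the survey features and figures are invented, and a real analysis would standardize the features first) might cluster respondents with k-means:

    # Cluster survey respondents into behavioural segments so products
    # and marketing messages can be aimed at each group.
    import numpy as np
    from sklearn.cluster import KMeans

    # Columns: stress score (0-10), hours of sleep, monthly baby-gear budget ($)
    respondents = np.array([
        [8, 4, 300],
        [7, 5, 280],
        [2, 8, 120],
        [3, 7, 100],
        [5, 6, 500],
    ])

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(respondents)
    print(kmeans.labels_)   # segment assignment for each respondent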

Much like SAP’s software business has branched out across 24 industry categories to apply its business intelligence software to specific solutions, startups are now cropping up to serve different niches in the big data market. Next, we’ll see which companies take up the offer to gain better data insight.

“It comes back to organizations fundamentally understanding their business model and how they differentiate,” Rose says.

Granify customers like Four Corner Store have already accepted the big data gospel. They’ve made more revenue since signing up with Granify, and are using it to bring in more customers via Google AdWords, Lawrence says. Four Corner Store is just one of 50 customers seeing an average sales uplift of 103 per cent, thanks to Granify’s team, which includes two PhD data scientists.

With results like that, cries of Hallelujah may be heard as the big data revolution continues to unfold.

Brian Jackson is the Editor at ITBusiness.ca. E-mail him at bjackson@itbusiness.ca, follow him on Twitter, read his blog, and check out the IT Business Facebook Page.


The three V’s of big data – and the security risks that come with it

Candice So | Published: 07/31/2014

We’ve been hearing the phrase “big data” being tossed around among companies, industries, and organizations for some time now – but what does it really mean?

For Jerrard Gaertner, president of the Canadian Information Processing Society, big data presents a lot of potential for businesses, the public sector, and all kinds of industries – but any work with big data needs to be done with data security in mind.

He teaches courses on this topic at Ryerson University and the University of Toronto, but he managed to distill a lot of that information into an hour-long mini-lecture during Wednesday’s TASK meeting in Toronto, giving a talk on what big data means and outlining not just its potential, but also the risks of adopting it without thinking of security first.

“In many cases … you’ve got an organization that’s got absolutely wonderful security and policies and procedures and segregation of duties, and everybody has a [Certified Information Systems Security Professional certification]. But big data is over here, and we’ve got our crown jewels in there, and a couple of dozen people have access to absolutely everything,” he said, addressing a room of security professionals during his talk. “I would just caution you that big data tends to be ignored or tends to be forgotten because it’s so new.”

So what is big data? Gaertner characterizes it as having at least three V’s:

– Volume

This is huge amounts of data – not just gigabytes or terabytes, but potentially petabytes or exabytes.

– Variety

Big data includes a variety of data, which isn’t just housed within Excel files or Word documents. This can include every file format out there, Gaertner said.

– Velocity

“Most big data installations – you can’t necessarily control how quickly the data comes in,” he said. For example, he mentioned how many companies have marketing departments that do sentiment analysis, meaning they analyze tweets on Twitter, posts on Facebook, or other areas of social media to figure out how a new product is performing in the marketplace and how people feel about it. However, given this is social media and Twitter users alone can create as many as 5,000 tweets a second, those seeking to harness big data can’t control how much data is coming in, nor how quickly, Gaertner said.
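
As a rough illustration of that kind of pipeline (the lexicon and tweets below are made up), a stream consumer scores whatever arrives rather than trying to throttle the source:

    # Toy sentiment scorer for an uncontrolled stream: score each item
    # as it arrives and keep a running average.
    POSITIVE = {"love", "great", "fast"}
    NEGATIVE = {"hate", "slow", "broken"}

    def score(text):
        words = set(text.lower().split())
        return len(words & POSITIVE) - len(words & NEGATIVE)

    def rolling_sentiment(stream):
        total = n = 0
        for tweet in stream:        # arrival rate is set by the source, not us
            total += score(tweet)
            n += 1
            yield total / n         # running average sentiment so far
        # (a real system would also window by time and scale out consumers)

    tweets = ["Love the new phone so fast", "Screen broken already hate it"]
    for avg in rolling_sentiment(tweets):
        print(round(avg, 2))        # 2.0, then 0.0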

Given how many businesses and industries want to tap into big data and the insights it can bring, it’s not surprising people are eager to just upload their data and start using open source frameworks like Apache Hadoop.

Still, Gaertner told the audience of security professionals this is where security and risk management come in. He named a number of factors that need to go into a strong, effective implementation of big data, such as creating appropriate research facilities, using relevant data sources, ensuring the hardware used has the capacity to process the data, using the right software and analytics tools, training staff in proper procedures – the list goes on.

However, a large chunk of that list requires security professionals to lend a hand, and people can’t just be left alone to play with big data without safeguards and controls, he said.

“Does the [chief security officer] or privacy officer know you’ve dumped all the information you own into a bucket and you’re playing with it?” Gaertner said, adding one of the biggest risks with big data is putting all of an organization’s data in one place, or all of its eggs in one basket.

He added security professionals also need to ask about the “provenance” of the data, or where it came from. After all, there are business risks, ethical risks, and privacy risks to using data from just anywhere and not adequately protecting it.

And of course, one of the most important pieces of security in any organization is to ensure employees are well-trained and educated in understanding the risks, especially when it comes to big data. That’s even more important than relying upon the tools and layers of defense set up to protect an organization’s data.

“You’re all security professionals,” Gaertner said to the room. “You know – never rely on the technology. It’s people, people, people.”

Thank you for reading.

Special Issue

Big data, business intelligence, and making actionable decisions based on an ever-increasing data store, often in near-real time

