Posted on

Socialising data increases both efficiency and productivity

Whilst self-service data analytics has many benefits, it also has a number of problems, most notably the lack of reusability of datasets. The cost of low-quality data is staggering: around $3.1 trillion according to IBM, whose studies also suggest that analysts spend around half their time finding and correcting bad data. This is a waste of time and opportunity, and stops big data from being as useful as it could potentially be.

The problem is that too much data acquisition happens in one-off projects: organisations take too long to obtain and clean good data, and the results are never saved. In other words, no one knows what anyone else has done. Other organisations then waste time trying to recreate and replicate previously available datasets, rather than delving into an already existing goldmine of data.

The solution is to approach data in a way that is more akin to social media. By socialising data acquisition, and integrating traditional approaches to self-service data with processes already common to social media platforms, we end up with high-quality, reusable datasets. This method provides operational repeatability, and means that the overall data acquisition process becomes quicker, easier and more efficient.

By being more collaborative, we ensure that good-quality, trustworthy data is easily available to all. Bad data has to be filtered for the useful material, which is yet another time-consuming process; a reservoir of independently reviewed, useful datasets solves this problem. For data analysts, this inevitably leads to an increase in productivity, as they no longer have to spend inordinate amounts of time trying to recreate old datasets. A more collaborative culture at a cross-organisational level will contribute to better business results.

Posted on

Many sectors are falling behind because of the advent of big data, but housing isn’t one of them

In terms of data acquisition, technology is advancing at a rapid rate. So fast, in fact, that many industries are being left behind. The housing sector, however, is managing to stay ahead of the curve by incorporating big data into its business plans. Some criticise the advent of technology for pushing traditional methods aside, but the general consensus is that the positives outweigh the negatives. Big data can provide agents with the information that allows them to tailor a housing solution to each individual client's needs.

Appraisals are the first port of call when it comes to using big data. Traditionally a complex process, appraisal can be simplified and streamlined by data analysis. Analysing the data can reveal the overall worth of neighbouring properties, which, in conjunction with other factors, can be used to determine the value of the property in question. Having an accurate value in mind also helps a housing business judge whether potential projects are worth the investment. Big data can likewise be used by insurance companies to analyse the local area and make a more accurate determination about what cover people might need.
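As a rough illustration of the appraisal idea, a property's value can be sketched from comparable neighbourhood sales. The function and figures below are entirely hypothetical, a minimal sketch assuming that comparables with a similar floor area should carry more weight:

```python
# Hypothetical sketch: estimating a property's value from comparable
# neighbourhood sales, weighting each comparable by similarity of floor area.
# All names and figures are illustrative, not real market data.

def estimate_value(target_sqm, comparables):
    """comparables: list of (floor_area_sqm, sale_price) tuples."""
    # Comparables with a floor area closer to the target get a higher weight.
    weights = [1.0 / (1.0 + abs(area - target_sqm)) for area, _ in comparables]
    total = sum(weights)
    weighted = sum(w * price for w, (_, price) in zip(weights, comparables))
    return round(weighted / total, 2)

recent_sales = [(95, 310_000), (110, 355_000), (100, 325_000)]
estimate = estimate_value(100, recent_sales)
```

A real appraisal model would of course fold in many more factors (condition, age, location), but the weighted-comparables structure is the same.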

According to some studies, almost half of all houses sold for over $5 million are bought through shell companies, and there is a worry that these companies use the property for money laundering. Using big data to analyse patterns and monitor suspicious activity allows business owners to have better control of their organisation, and fewer worries about the potentially criminal intentions of any interested parties in an expensive property.

If a property has been on the market for a long time, and there is no obvious reason why, such as a bad review, then businesses can use big data to engage in targeted marketing. By studying housing trends in certain demographics, and combining this information with data about the property in question, you can tailor your search criteria so that you only target the specific kind of person that is most likely to buy the property. This means that you end up with fewer properties lingering on the market for extended periods of time.

Posted on

Keeping bad data can yield some surprisingly helpful information

Cleaning up bad data is one of the less romantic aspects of data science, but it is an unfortunate necessity. In our haste to make our data as clean as possible, however, might we be overdoing it a little?

Obviously we all want our data to be free from mistakes; error-strewn information is of no use to anyone. But when cleaning data up, these bad inputs and outliers often get discarded without a second thought, when they can actually be useful. Isn't it better to understand where an error originates, so as to prevent it happening again, than simply to throw the bad result away?

When you get a bad reading, there are myriad possible explanations. From faulty equipment or inexperienced operators to localised anomalies, data can be negatively affected by all sorts of factors.

When these results come in and it is immediately apparent that the data is faulty, it is common practice to remove it to prevent it affecting the other results. But some data analysis companies have encouraged their employees to treat bad data as an outlier and ignore it when collating the results. Crucially, it isn't deleted immediately. This means the outliers can later be analysed to determine the problem at the core of the reading. So next time you have to cleanse your big data, keep hold of the bad results and use them to try to prevent more errors in the future.
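The "flag, don't delete" approach above can be sketched in a few lines. This is a minimal, hypothetical example: suspect readings beyond a z-score threshold are set aside for root-cause analysis rather than discarded (a deliberately low threshold is used here because the sample is tiny; real pipelines often use more robust, median-based tests):

```python
# Minimal sketch of "flag, don't delete": readings far from the mean are
# marked as suspect and retained for later diagnosis, not thrown away.
import statistics

def split_outliers(readings, threshold=2.0):
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    clean, suspect = [], []
    for r in readings:
        if stdev and abs(r - mean) / stdev > threshold:
            suspect.append(r)   # kept for root-cause analysis, not deleted
        else:
            clean.append(r)
    return clean, suspect

# Illustrative sensor log: 97.4 looks like a faulty reading.
readings = [20.1, 19.8, 20.3, 20.0, 97.4, 19.9]
clean, suspect = split_outliers(readings)
```

The `suspect` list is what gets investigated: if every bad reading traces back to the same instrument, you have found the real problem rather than merely hidden it.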

Posted on

UAVs working beyond the line of sight will be at the forefront of aerial inspections

Using UAVs to provide Beyond Visual Line Of Sight (BVLOS) inspections allows customers to access safe and reliable data about many industrial assets.
Having the option to use BVLOS to complete inspections and surveys makes the whole process far cheaper and more efficient than traditional survey methods.

In areas that are hard to access, the only option has traditionally been to invest heavily in constructing roads and buildings so that a ground team can complete the survey. With the advent of UAV technology like that being developed by Lockheed Martin, this expense can be avoided, saving time and money.

There are also environmental considerations that must be accounted for. For example, if a corporation needs to survey a portion of the Alaskan wilderness for an oil pipeline, it must invest heavily in infrastructure designed for a cold climate, such as winter roads.

Construction work of this magnitude can be immensely damaging to the surrounding wildlife. With a BVLOS survey, this can easily be avoided. Traditional survey methods, such as ground teams or low-flying helicopters, can also scare away wildlife, which is counterproductive given that many such surveys are wildlife surveys carried out in remote, inaccessible areas.

Posted on

How companies may be forced to put “back doors” in their encryption software

Tech companies are coming under increasing pressure to create “back doors” in their encrypted messenger software so that, with a warrant, the police and intelligence services can view previously private conversations of suspects. End-to-end encryption involves scrambling messages in transit and then, if the recipient has the right key, unscrambling them. This is the system WhatsApp uses by default to protect the privacy of messages sent on its app.

Analysing the metadata can give clues as to what was in the sent messages, including when they were sent, how many people received them, and the location of the sender and recipient at the time the message was dispatched. Crucially, though, the metadata cannot reveal the specific content of the messages.
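The split between visible metadata and opaque content can be illustrated with a toy cipher. This is a one-time-pad sketch, not WhatsApp's actual protocol and not a production cipher: the point is simply that the carrier can read the envelope but not the body, which only the key held at the two endpoints can unscramble.

```python
# Toy illustration (NOT a production cipher): a one-time pad shows the
# end-to-end idea. The message body is unreadable in transit, while the
# metadata (sender, recipient, timestamp) stays visible to the carrier.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # shared only by the two endpoints

envelope = {
    "from": "alice",                      # metadata: visible in transit
    "to": "bob",
    "sent": "2017-03-01T09:30Z",
    "body": encrypt(message, key),        # content: opaque without the key
}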

This kind of encryption can be used to protect all kinds of data, in many data acquisition environments, for perfectly valid and acceptable reasons, but it also means that criminal activity, including financial information, can be hidden from the authorities.

But many companies say there is no way to install a back door in their encryption that only the police and security services can use: it means opening up private information to all, the good and the bad. The companies say that, overall, encryption keeps us safer than we would be if back doors were installed, despite the potential for users to abuse the privacy it provides.

Any back door into encryption software is surely open to abuse, but in this case, does the end justify the means?

Posted on

New buoys could save lives by providing real time storm data acquisition

Typhoons are an ever-present danger to nearly 1 billion people in coastal and inland regions of the north-western Pacific Ocean. They can cause billions of dollars of damage and thousands of deaths. Given this, early-warning detection systems are vital. To that end, a research group in Taiwan is attempting to develop a network of high-tech data acquisition buoys that can gather real-time, high-resolution data and relay it to Taiwan’s Central Weather Bureau.

Data can be sent every 12 minutes, which means that these buoys, if a network of them can be successfully implemented, could save lives. These new systems accommodate more meteorological and oceanic sensors than traditional moorings.

These buoys have also been specifically designed to record data about such storms on the open ocean. Lithium batteries can support the buoy with power for around 18 months before it needs recharging. The data acquisition system and electric power scheme are all designed to save power, and a specially designed cutter has been installed so as to prevent the buoy becoming entangled by fishing lines.

The problem with setting up such a network is not just that the buoys are expensive to make; they also require a number of precise instruments to be calibrated, as well as support from ships and satellite communications.

So why aren’t satellites sufficient? They are very useful, but their sensors don’t penetrate far under the water. Using them in conjunction with a network of high-tech buoys therefore provides the most effective system to pre-empt, study and limit the effects of typhoons.

Now, this is a world away from the data acquisition challenges that most people confront. Not only is the data itself complex and hard to analyse, but the acquisition environment presents issues that are difficult to overcome. If the system can be made to work, though, it could provide an essential early warning, and save lives.

Posted on

Sweating Data Acquisition

The popularity of wearable health technology has sparked something of a fitness revolution in recent years, and is now set to take the medical sector by storm. Researchers are looking to develop wearable electronics that enable continuous monitoring of health conditions and make real-time, non-invasive data acquisition possible – and human sweat appears to be the secret ingredient for ongoing data acquisition.

Sweat analysis has long been a tried and tested means of data acquisition within the medical field. Perspiration biomarkers help to identify human health conditions, sweat analysis is used for sports doping tests, and a sweat chloride test is the standard for diagnosing cystic fibrosis.

Being the most easily accessible body fluid, sweat provides accurate and insightful physiological information for analysis – but current methods of data acquisition involve collecting the patient’s perspiration in a cup and sending it off to the lab, which is both laborious and time-consuming. At the 2016 International Electron Devices Meeting (IEDM), a team from the University of California offered an intelligent solution to this problem.

The “Wearable Sweat Bio-Sensor” – made from ultra-low power, flexible, printable electronics – measures the detailed sweat profiles of a wide spectrum of analytes including metabolites, electrolytes and heavy metals during various indoor and outdoor physical activities, enabling accurate, real-time data acquisition and analysis. Sensors are attached to the body like patches and monitor the patient’s condition at a molecular level, allowing health professionals to detect each and every change. This technology has life-saving potential.

Posted on

Three Data Acquisition and Tech Trends for Businesses in 2017

1)    Augmented Reality will become Enterprise Reality.

As technology powers forward, businesses should be experimenting with a wide range of interactive media – including augmented, virtual, and mixed reality technologies. AR offers companies exciting opportunities to engage closely with customers, and also enables them to train their staff in a virtual environment, maximising productivity and limiting risks. As an estimated 100 million people will be shopping in augmented reality settings by 2020, it’s essential that companies start to think about how the software can be used within their organisation. The data acquisition opportunities that sit within AR are significant, as are the ethical issues associated with its collection and use.

2)    The Future of Innovation is Technology that Augments the Human Experience.

With the general population becoming increasingly tech-savvy, companies will have to think carefully about access and design to give them the edge over their competitors. Design thinking – an approach that visualises the end experience first and combines people, processes and technologies to achieve that vision – will become imperative for maintaining a competitive advantage. In 2017, user experiences will undergo a shift; typing will give way to more immersive methods of digital interaction like gestures, haptics, voice and gaze, offering seamless engagements between users and machines.

3)    From Data Acquisition to Data Intelligence.

Today’s workforce comprises a mixture of humans, intelligent systems and devices, which naturally brings its own complex set of guidelines regarding data acquisition and protection. In a recent survey, the majority of mid-level executives were grappling with ethical issues stemming from the use of smart technologies in the workplace. For example, some car insurers are using special telematics devices to monitor consumer driving habits, but whilst this may seem like a good idea for rewarding careful drivers with a premium discount, are consumers truly aware of who owns the data being produced by the car? And would they be comfortable knowing that insurers use the telematics device as a trigger to call 999 in the event of an accident? As technology advances further and further, there is a greater need for education surrounding the ethics of digital technologies – and it must fall to organisations to provide this education for their employees.

Posted on

Privacy and Data Acquisition

What does privacy actually mean in an age when everything we do is being watched, when data acquisition is part of our everyday life, and when our personal data is constantly at risk of being shared over the internet? As the World Wide Web’s 25th birthday approaches, it seems appropriate to review the progression of digital privacy protection in a bid to foresee what the future has in store for us.

As far as privacy protection is concerned, the internet still has much to learn. Sometimes it feels as though businesses are experimenting with our personal data, whether for personalised promotion, targeted advertising, or sketching out a potential new customer profile.

It’s true that consumers benefit from a user experience tailored to their individual needs that also enables new sources of value creation. However, recent findings have shown that internet users often feel as though they have no control over how their personal information is stored and used by third parties. Julia Angwin, a privacy activist who led a ‘Wall Street Journal’ investigation into the software used by companies to target customers, has steadfastly attempted to protect her digital privacy. This, however, requires a commitment most consumers are reluctant to make, not least because it requires them to forego using popular internet services.

Even the most privacy-conscious internet users, who are no strangers to secure browsers and burner phones, lack an impregnable privacy solution that encompasses their complete digital footprint. Existing measures, such as privacy-management solutions like Silent Circle, only protect users’ communications through their smartphone, which is ill-suited to today’s multiscreen digital experience. Companies can use cross-device tracking techniques to access profiling data from multiple sources, further compromising a user’s ability to remain anonymous. Recent advances in deep learning and natural language processing will also make it harder for consumers to protect their conversations from prying ears.

Unfortunately, legislative changes geared towards safeguarding personal data online are likely to take time, owing to a laxity with personal information that would never be acceptable in the offline world, a general lack of awareness surrounding the ways in which it is being used, slow-moving regulators, and a tendency amongst consumers to overlook T&Cs, which protects malicious developers. Organisations should be obliged to publish simplified versions of their T&Cs within their apps, just as it is mandatory for medicines to be dispensed with information about side effects, and telecom adverts are obliged to disclose associated fees.

In the wake of several high-profile security breaches of recent years, consumers have awoken to the devastating consequences of sharing their information online. However, it’s not all bad – increased awareness of the ways in which data can be shared and exploited helps to pave the way for a brighter future. Activism amongst consumers for regulations that protect their personal information is more important than ever. As a society, it is our responsibility to shift the focus away from business innovation, and on to consumer rights.


Posted on

Data Acquisition Sequence and Interpretation

The ‘Internet of Things’ (IoT) plays an important role in the business insight and strategies of modern corporations. An interconnected network of things, people, machines, devices and environments makes up the IoT, supplying businesses with the data they need to inform the development of their strategies going forward.

The sequence of the acquisition and interpretation of IoT data is dubbed ‘The Three As’, which represent:

  • Data Acquisition. This stage can often be overlooked by IT communities because they automatically receive the data pre-digitised in 1s and 0s. IoT data is predominantly collected in an analog format, and converted to digital using sensors and specialist software.
  • Data Analysis. This involves interpreting the data collected, and using these findings to inform further action. Analysis falls under three subheadings:
    • Business Insight. This includes asking questions like, where will a particular demographic go on holiday this summer? Or, is the check-out line too long?
    • Engineering Insight. This involves knowing when things may need maintenance checks, or overseeing if products on a manufacturing line are being built and controlled properly.
    • Scientific Insight incorporates medical data analysis – for example, learning if a tumour is benign, or if new weather patterns will affect crop growth.
  • Action. This could be either physical or executive, i.e., operating a robotic arm, or knowing when to move inventory to increase sales.
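The acquisition stage above deserves a concrete illustration, since IT teams rarely see it: real-world signals arrive as analog voltages and must be quantised into the 1s and 0s everything downstream consumes. The sketch below simulates a hypothetical 8-bit analog-to-digital converter (ADC); the reference voltage and signal are illustrative.

```python
# A minimal sketch of the acquisition step: an analog voltage is sampled
# and quantised by a hypothetical 8-bit ADC into discrete digital codes.
import math

def adc_sample(voltage, v_ref=5.0, bits=8):
    """Quantise an analog voltage in [0, v_ref] to an n-bit digital code."""
    levels = 2 ** bits
    clamped = min(max(voltage, 0.0), v_ref)   # clip out-of-range inputs
    return int(clamped / v_ref * (levels - 1))

# Simulate sampling one cycle of a sine wave at 10 samples per cycle.
samples = [adc_sample(2.5 + 2.5 * math.sin(2 * math.pi * t / 10))
           for t in range(10)]
```

Each sample is now an integer in 0–255: this digital stream is what the analysis and action stages operate on.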

Data acquisition is imperative to the long-term strategy of contemporary businesses; it helps to identify areas for growth and expansion, boosts revenue, and assists in tailoring services to customer needs.