
Are police forces infringing on the privacy of the public?


Data acquisition is fast becoming a common and accepted practice among large companies: collecting data and information in order to build knowledge about users, and thereby determine their interests, beliefs and, often, location. This can cause difficulties with privacy, and a breach can cause a company a great deal of trouble. In recent years, the bodycam industry has been growing steadily, with usage increasing particularly in the police force.

Police forces are equipping more and more officers with bodycams, to record events and permit more data acquisition. Furthermore, the latest AI technology is being developed and installed to make all the video captured by these bodycams searchable.

Specifically, a new branch of AI called deep learning is being implemented by the police, and it promises to take the internet by storm. Using sophisticated data acquisition technology, it can save time otherwise spent sifting through mountains of often useless footage, and instead focus on material that is genuinely pertinent to police investigations.
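
As a rough illustration of how such a search layer might work (a minimal sketch, not any force's actual system), footage can be sampled frame by frame, tagged with a pretrained image classifier, and indexed by label. The model choice and the tag_video helper below are assumptions made purely for illustration.

```python
# Minimal sketch: tag sampled video frames with a pretrained classifier so
# footage becomes keyword-searchable. Model and labels are illustrative.
import cv2
import torch
from torchvision import models
from torchvision.models import ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

def tag_video(path, every_n_frames=150):
    """Return {label: [timestamps in seconds]} for sampled frames."""
    index = {}
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30
    frame_no = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_no % every_n_frames == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            tensor = preprocess(torch.from_numpy(rgb).permute(2, 0, 1))
            with torch.no_grad():
                pred = model(tensor.unsqueeze(0)).argmax(1).item()
            index.setdefault(labels[pred], []).append(frame_no / fps)
        frame_no += 1
    cap.release()
    return index
```

A real deployment would use purpose-trained models and far more careful handling of evidential footage; this only shows the indexing idea.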

This increase in the use of bodycams is supposed to increase police accountability, but there are issues of privacy to consider. How do we stop them from becoming nothing more than mobile CCTV cameras? The data acquired in this manner will have to be regulated in some way to prevent privacy issues from arising. Exactly how police forces and other organisations will incorporate AI systems such as these into their everyday strategy remains to be seen, but in terms of data acquisition, these systems are very much part of the future.


Should big data be regarded in the same way as a commodity or a national asset?

The value of the big data industry is increasing exponentially, and is expected to reach $100 billion by 2025. As a consequence, trillions of dollars of value will be added to other industries. The problem in this new age of digital economy is that each country has its own rules in terms of data regulation. The current global system hasn’t adapted.

Three aspects have been identified as potentially problematic. The first is the right to privacy, the second is the absence of a legal framework suitable for the processing of big data, and the third is the lack of governmental controls. Current infrastructure, as we have mentioned, is insufficient for this new digital age. A new digital platform provider may be required to keep up.

The solution is a common marketplace of shared, cross-industry digital platforms. And new training methods will certainly be needed in order to allow employees to properly and effectively deal with the inevitable challenges thrown up by managing big data.

The key to all of this will be investment in people, training and technology, so that the new digital economy can be managed properly. As the big data industry evolves, these issues will require solving, and new regulations will come into play.


Google is using data backlogs to track everything you’ve ever done

Every time we interact with any online site, data about us is stored. The more we interact with a website, the more information about us is retained. Google, one of the sites we use the most, has begun to dredge through the vast quantities of data it stores about us, in order to match credit card histories to things like browsing, location and advertising histories, amongst others.

Despite this mingling of data sets from different corporations, Google claims that your private data remains anonymous, but the mere idea that companies like Google and Facebook combine their large reserves of data to profile their users may come as an unwelcome shock to many. Some companies even mix offline data with online: Facebook, for example, signed deals with companies like Experian to do just that.

After accumulating this data, Google can create a super-profile of its users. This is helpful because it allows Google to dictate terms to the many advertisers who use the site. Knowing the extensive viewing and purchasing histories of its users, Google has an idea of who might be interested in buying a certain merchant’s product before the merchant does.
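
To illustrate the mechanics (a sketch only; the column names and rows are invented, not Google's actual schema), building such a profile is essentially a series of joins on a shared user identifier:

```python
# Illustrative sketch: separate data sets (browsing, purchases, location)
# merged into one profile keyed on a shared user id. All data is invented.
import pandas as pd

browsing = pd.DataFrame({"user_id": [1, 2], "top_category": ["running", "cooking"]})
purchases = pd.DataFrame({"user_id": [1, 2], "last_purchase": ["trainers", "knife set"]})
locations = pd.DataFrame({"user_id": [1, 2], "frequent_area": ["gym district", "market"]})

# Each merge adds another slice of a user's life to the same row.
profile = browsing.merge(purchases, on="user_id").merge(locations, on="user_id")
print(profile)  # one row per user combining all three histories
```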

Key to this is having easy access to location data. Using Android, Google can track your movements, which is what gives it such an advantage when it comes to identifying potential customers. Location data allows Google to insert itself into every transaction.


Why is data privacy important?

It is vital for big data analysts to be transparent and open about the data they collect and how they use it, to keep the concerns of their users at bay.
Unfortunately, the usage of data is not addressed as the important privacy issue that it is. A data breach is the most common, and arguably the most damaging, privacy mistake your company can make: it can reach the headlines and cause tremendous damage and embarrassment. Other pitfalls include discriminatory algorithms and illegal bias, inaccurate information caused by relying on fake news, and identity reverse engineering, which basically consists of undoing anonymisation.
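
To make that last pitfall concrete: in the classic re-identification scenario, a handful of quasi-identifiers is enough to link an "anonymised" record back to a named person. The tables below are entirely invented, a minimal sketch of the join involved.

```python
# Sketch of identity reverse engineering: an anonymised table is
# re-identified by joining quasi-identifiers (postcode, birth date, sex)
# against a public record. All rows here are invented.
import pandas as pd

anonymised = pd.DataFrame({
    "postcode": ["SW1A"], "birth_date": ["1980-03-14"], "sex": ["F"],
    "diagnosis": ["asthma"],  # the supposedly private attribute
})
public_roll = pd.DataFrame({
    "name": ["Jane Doe"],
    "postcode": ["SW1A"], "birth_date": ["1980-03-14"], "sex": ["F"],
})

# The join undoes the anonymisation: a name is now attached to a diagnosis.
reidentified = anonymised.merge(public_roll, on=["postcode", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```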

Big data analytics can amass a huge amount of information, which, if breached, can cause people huge problems. For example, your bank details are probably sitting in many databases, and if accessed by someone else, the results could be detrimental.

Next, we come to the way information leads to knowledge, and the thought of unknown companies having knowledge about their customers can be extremely unnerving. Information is what companies derive when they use the data they collect to understand users’ behaviours. From this, knowledge can be gained: connecting the dots between different areas of a user’s life, such as their personal interests, political and religious views, and shopping habits.

Over time, as information is gathered, knowledge about a person accumulates into wisdom: an extremely personal profile of a user, cultivated over many years. People are often unaware of this, and would be very uncomfortable if they realised that someone knows so much about them. This is arguably the biggest problem faced by large data analytics companies.

In order to maintain a positive relationship with users, it is vital to be transparent and upfront about what and who you analyse. You should let your users know what your analytic capabilities are, and, in general terms, what you know about them and why. If you cannot give an adequate reason for holding a piece of information, you should probably reconsider holding it, in order to avoid a scandal.
However, despite the importance of transparency, it is also important not to give away your strategic secrets – you are a competitive business, and if you give away too much information, your value disappears. Therefore, you must be transparent, but keep your vital strategic information to yourself. Try to explain what you do, rather than how you do it.

Primarily, you must let people know what you know, and what you’re capable of doing with this information. This won’t dissipate the privacy issue, but over time, transparency will build a relationship with your users and this trust is what will stop you becoming a scandal.


Socialising data increases both efficiency and productivity

Whilst self-service data analytics has many benefits, there are also a number of problems, most notably the lack of reusability of datasets. The cost of low-quality data is staggering, around $3.1 trillion according to IBM, whose studies suggest that analysts spend around half their time finding and correcting bad data. This is a waste of time and opportunity, and stops big data from being as useful as it could potentially be.

The problem lies in the fact that data acquisition involves too many one-off projects: teams take too long to obtain and clean good data, and the results aren’t saved. In other words, no one knows what anyone else has done. This means that other organisations waste time trying to recreate and replicate previously available datasets, rather than being able to delve into an already existing goldmine of data.

The solution is to approach data in a way that is more akin to social media. By socialising data acquisition, and integrating traditional approaches to self-service data with processes already common to social media platforms, we end up with high-quality, reusable datasets. This method provides operational repeatability, and means that the overall data acquisition process becomes quicker, easier and more efficient.
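
What might that look like in practice? Here is a minimal sketch of the idea, with invented class names and fields: a shared catalogue where cleaned datasets are published once, rated by peers, and found by search rather than rebuilt. A real platform would add storage, access control and lineage.

```python
# Minimal sketch of a "socialised" data catalogue: publish once, rate,
# then search instead of recreating datasets. Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    name: str
    owner: str
    description: str
    tags: list
    ratings: list = field(default_factory=list)

    def average_rating(self):
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

class Catalogue:
    def __init__(self):
        self.entries = []

    def publish(self, entry):
        self.entries.append(entry)

    def search(self, tag, min_rating=3.0):
        """Return trusted, reusable datasets matching a tag."""
        return [e for e in self.entries
                if tag in e.tags and e.average_rating() >= min_rating]

catalogue = Catalogue()
catalogue.publish(DatasetEntry("eu_sales_2023_clean", "analytics-team",
                               "Deduplicated EU sales records", ["sales", "eu"],
                               ratings=[4, 5]))
print([e.name for e in catalogue.search("sales")])
```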

By being more collaborative, we ensure that good-quality, trustworthy data is easily available to all. Bad data needs to be filtered for the useful material, which is yet another time-consuming process; having a reservoir of independently reviewed, useful datasets solves this problem. For data analysts, this inevitably leads to an increase in productivity, as they no longer have to spend inordinate amounts of time trying to recreate old datasets. A more collaborative culture at a cross-organisational level will contribute to better business results.


Many sectors are falling behind because of the advent of big data, but housing isn’t one of them

In terms of data acquisition, technology is advancing at a rapid rate. So fast, in fact, that many industries are getting left behind. The housing sector, however, is managing to stay ahead of the curve by incorporating big data into its business plans. Some criticise the advent of technology for pushing traditional methods aside, but the general consensus is that the positives outweigh the negatives. Big data can provide agents with information that allows them to tailor a housing solution to each individual client’s needs.

Appraisals are the first port of call when it comes to using big data. Traditionally a complex process, appraisal can be simplified and streamlined by big data. Analysis of the data can reveal the overall worth of neighbouring properties, which can be used, in conjunction with other factors, to determine the value of the property in question. These valuations are also useful going forward: having an accurate figure in mind can tell a housing business whether potential projects are worth the investment. Insurance companies can also draw on big data to analyse the local area and make a more accurate determination of what cover people might need.
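
As a toy sketch of the idea (the features and figures below are invented, and a real appraisal model would use far richer data), a simple regression fitted on recent neighbourhood sales can price a new listing:

```python
# Toy data-driven appraisal: fit a regression on neighbourhood sales,
# then estimate the value of a new property. All numbers are invented.
from sklearn.linear_model import LinearRegression

# [floor area m^2, bedrooms, distance to centre km] -> sale price
X = [[80, 2, 5.0], [120, 3, 3.5], [95, 2, 4.0], [150, 4, 2.0]]
y = [240_000, 380_000, 300_000, 520_000]

model = LinearRegression().fit(X, y)
estimate = model.predict([[110, 3, 3.0]])[0]
print(f"Estimated appraisal: ${estimate:,.0f}")
```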

According to some studies, almost half of all houses sold for over $5 million are bought through shell companies, and there is a worry that these companies use the property for money laundering. Using big data to analyse patterns and monitor suspicious activity allows business owners to have better control of their organisation, and fewer worries about the potentially criminal intentions of any party interested in an expensive property.
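
One common way to monitor for such patterns, shown here as a hedged sketch with invented features, is an anomaly detector that flags deals which look nothing like the rest:

```python
# Sketch of monitoring suspicious activity: an isolation forest flags
# purchases whose pattern differs sharply from the others. The features
# (price, days on market, buyer entity age in years) are invented.
from sklearn.ensemble import IsolationForest

deals = [
    [450_000, 90, 12], [600_000, 120, 8], [525_000, 60, 15],
    [5_800_000, 3, 0],  # very high price, instant sale, brand-new shell entity
]
flags = IsolationForest(contamination=0.25, random_state=0).fit_predict(deals)
for deal, flag in zip(deals, flags):
    print(deal, "SUSPICIOUS" if flag == -1 else "ok")
```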

If a property has been on the market for a long time, and there is no obvious reason why, such as a bad review, then businesses can use big data to engage in targeted marketing. By studying housing trends in certain demographics, and combining this information with data about the property in question, you can tailor your search criteria so that you only target the specific kind of person that is most likely to buy the property. This means that you end up with fewer properties lingering on the market for extended periods of time.


Keeping bad data can yield some surprisingly helpful information

Cleaning up bad data is one of the less romantic aspects of data science, but it is an unfortunate necessity. Yet in our haste to make our data as clean as possible, might we be overdoing it a little?

Obviously we all want our data to be free from mistakes; error-strewn information is of no use to anyone. But when cleaning it up, these bad inputs and outliers often get discarded without a second thought, when actually they can be useful. Isn’t it better to understand where an error originates, so as to prevent it happening again, than simply to throw the bad result away?

When you get a bad reading, there is a myriad of possible explanations. From faulty equipment or inexperienced operators to localised anomalies, data can be negatively affected by all sorts of factors.

When these results come in and it is immediately apparent that the data is faulty, it is common practice to remove it so as to prevent it affecting the other results. But some data analysis companies have encouraged their employees to treat bad data as an outlier, and exclude it when compiling the results. Crucially, it isn’t deleted immediately. This means these outliers can be analysed later to determine the problem at the core of the reading. So next time you have to cleanse your big data, keep hold of the bad results and use them to try to prevent more errors in the future.
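
A minimal sketch of that practice, with invented sensor readings: flag suspect values rather than dropping the rows, so the clean set feeds the results while the suspects wait to be diagnosed. The median-based rule here is one illustrative choice among many.

```python
# Mark bad readings as outliers instead of deleting them. The modified
# z-score (median/MAD) is robust: one bad reading cannot inflate the
# spread estimate the way it would with a plain mean/std rule.
import pandas as pd

readings = pd.DataFrame({"sensor": ["a"] * 6,
                         "value": [20.1, 19.8, 20.4, 19.9, 87.3, 20.2]})

med = readings["value"].median()
mad = (readings["value"] - med).abs().median()
readings["is_outlier"] = 0.6745 * (readings["value"] - med).abs() / mad > 3.5

clean = readings[~readings["is_outlier"]]    # feeds the reported results
suspects = readings[readings["is_outlier"]]  # kept, not deleted: diagnose later
print(suspects)  # investigate: faulty probe? operator error? local anomaly?
```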


UAVs working beyond the line of sight will be at the forefront of aerial inspections

Using UAVs to provide Beyond Visual Line Of Sight, or BVLOS, inspections allows customers to access safe and reliable data about many industrial assets. Having the option to use BVLOS to complete inspections and surveys makes the whole process far cheaper and more efficient than traditional survey methods.

In areas that are hard to access, the traditional choice is to invest heavily in the construction of roads and buildings to allow a ground team to complete the survey. But with the onset of UAV technology like that being developed by Lockheed Martin, this expense can be avoided, saving time and money.

There are also environmental considerations that must be accounted for. For example, if a corporation needs to survey some portion of the Alaskan wilderness for an oil pipeline, it must invest heavily in infrastructure designed for a cold climate, such as winter roads.

Construction work of this magnitude can be immensely damaging to the surrounding wildlife. With a BVLOS survey, this can be avoided easily. Another issue is that traditional survey methods, such as ground teams or low-flying helicopters, can scare away the very wildlife being studied, which is often counterproductive given that many wildlife surveys are carried out in remote, inaccessible areas.


How companies may be forced to put “back doors” in their encryption software

Tech companies are coming under increasing pressure to create “back doors” in their encrypted messenger software, so that, with a warrant, the police and intelligence services can view the previously private conversations of suspects. End-to-end encryption involves scrambling messages in transit and then, if the recipient has the right key, unscrambling them. It is the system WhatsApp uses by default to protect the privacy of messages sent on its app.
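
A minimal sketch of the principle, using the PyNaCl library (this shows the general public-key pattern, not WhatsApp's actual system, which uses the more elaborate Signal protocol):

```python
# End-to-end encryption in miniature: only the recipient's private key
# can unscramble the message; anyone in transit sees only ciphertext.
from nacl.public import PrivateKey, Box

alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key...
sealed = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# ...and only Bob's private key can decrypt it. The server relaying the
# message never holds a key that opens it.
assert Box(bob_sk, alice_sk.public_key).decrypt(sealed) == b"meet at noon"
```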

Analysing the metadata can give clues as to what was in the sent messages, including when they were sent, how many people received them, and the location of the sender and recipient at the time the message was dispatched. But crucially, the metadata cannot reveal the specific content of the messages.

This kind of encryption can be used to protect all kinds of data, in many data acquisition environments, for perfectly valid and acceptable reasons, but it also means that criminal activity, including financial information, can be hidden from the authorities.

But many companies say that there is no way to install a back door in their encryption that only the police and security services can use. It means opening up private information to all, the good and the bad. The companies say that, overall, encryption keeps us safer than we would be if back doors were installed, despite the potential for users to abuse the privacy it provides.

Any back door into encryption software is surely open to abuse, but in this case, does the end justify the means?


New buoys could save lives by providing real-time storm data acquisition

Typhoons are an ever-present danger to nearly 1 billion people in coastal and inland regions of the north-western Pacific Ocean. They can cause billions of dollars of damage and thousands of deaths. Given this, early warning detection systems are vital. To that end, a research group in Taiwan is attempting to develop a network of high-tech data acquisition buoys that can gather real-time, high-resolution data and relay it to Taiwan’s Central Weather Bureau.

Data can be sent every 12 minutes, which means that these buoys, if a network of them can be successfully implemented, could save lives. These new systems accommodate more meteorological and oceanic sensors than traditional moorings.

These buoys have also been specifically designed to record data about such storms on the open ocean. Lithium batteries can supply the buoy with power for around 18 months before recharging is needed. The data acquisition system and electric power scheme are all designed to save power, and a specially designed cutter has been installed to prevent the buoy becoming entangled in fishing lines.
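
As a purely hypothetical sketch of such a power-saving scheme (the read_sensors and transmit helpers stand in for hardware drivers the real firmware would provide), the firmware can buffer sensor samples and wake the radio only once per 12-minute reporting window:

```python
# Hypothetical duty-cycled acquisition loop: poll sensors cheaply, buffer
# the readings, and power the radio only for the 12-minute report burst.
import time

REPORT_INTERVAL_S = 12 * 60   # data relayed every 12 minutes
SAMPLE_INTERVAL_S = 60        # sensors polled once a minute

def run(read_sensors, transmit):
    buffer = []
    next_report = time.monotonic() + REPORT_INTERVAL_S
    while True:
        buffer.append(read_sensors())      # low-power sensor read
        if time.monotonic() >= next_report:
            transmit(buffer)               # radio on only for this burst
            buffer.clear()
            next_report += REPORT_INTERVAL_S
        time.sleep(SAMPLE_INTERVAL_S)      # sleep to conserve battery
```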

The problem with setting up such a network is not just that the buoys are expensive to make, but also that they require a number of precise instruments to be calibrated, as well as support from ships and satellite communications.

So why aren’t satellites sufficient? They are very useful, but their sensors don’t penetrate far under the water. Using them in conjunction with a network of high-tech buoys therefore provides the most effective system to pre-empt, study, and limit the effects of typhoons.

Now, this is a world away from the data acquisition challenges that most people confront. Not only is the data itself complex and hard to analyse, but the acquisition environment also presents issues which are difficult to overcome. Yet if the system can be made to work, it could provide an essential early warning, and save lives.