A privacy impact assessment (PIA) is a way for organisations to assess and address privacy risks when they're collecting, using, or sharing personal information.
Doing a PIA will help your organisation:
check whether your project complies with privacy laws
identify and minimise privacy risks (e.g. data breaches)
give customers or clients certainty that their information is safe
improve your information management systems.
There are real risks for your organisation if your project involves personal information, or intrudes on people's privacy, and you don't do a PIA. We've developed tools and documents (listed below) to help you succeed. If you're unsure whether you need a PIA, fill out our brief privacy analysis template, which will help you decide.
The European Commission has determined that New Zealand has an adequate level of protection for personal data transferred from the European Union. Essentially ‘adequacy’ says that our legislation isn’t the same as Europe’s, but its outcomes are similar and can be trusted.
Privacy Commissioner Michael Webster says this is good news for trade and ease of doing business in the digital age, and helps ensure smooth cross-border data transfers.
Only a small number of countries have achieved EU adequacy status, and this recognition is important for New Zealand in a global business environment.
“Adequacy means that New Zealand is a good place for the world to do business; we have strong privacy protections in our legislation and an empowered regulator,” says Michael.
“Privacy regulation supports the digital economy, with the Privacy Act being the only statute that requires data security safeguards to be in place; that underpins our relationships with key trading partners, which is crucial for any global operator.”
“An example of that is New Zealand’s $400 million video and computer games sector, which is enabled by good data protection standards.”
Adequacy demonstrates the importance, both at home and on the world stage, of having strong privacy protections. However, it’s not a ‘set and forget’ situation; the countries we benchmark ourselves against are strengthening their privacy and data protection laws now and we need to too.
“Now is the time for New Zealand to evolve our data protection laws if we want to retain adequacy,” says Michael.
“We live in dynamic times with significant technological advancements, yet we’re operating on a Privacy Act that is based on policies agreed in 2013.
“This past decade has seen the development and widespread adoption of technologies such as biometrics and AI, and the Act does not account for new risks to children’s privacy.
“We need to ensure our Privacy Act keeps up with global privacy standards or risk that we may no longer be one of the safest places to process personal information,” says the Privacy Commissioner.
The European Commission noted the Privacy Amendment Bill as a positive move. The Commissioner is keen to support progress on that. His office also recommends the following developments to the Privacy Act 2020:
– A set of specific amendments to make the Privacy Act fit-for-purpose in the digital age.
– A civil penalty regime for major non-compliance alongside new privacy rights for New Zealanders to better protect themselves.
– Stronger requirements for automated decision-making, and for agencies to demonstrate how they meet privacy requirements.
Proceeding with the above amendments is important, especially if the Government proceeds with the proposed Consumer Data Right.
The adequacy decision for New Zealand was first adopted on 19 December 2012.
A staff member had saved the woman’s completed review form, which they believed was a blank template, to their computer desktop for easy access the next time they needed to send a form of this type to a future client. In fact, while the front page was blank, subsequent pages contained the woman’s personal information. The staff member sent the woman’s completed form, believing it was a blank template, to other clients. One of those recipients then located the woman on social media and informed her that she had received her information from the agency in error. Additionally, an anonymous person contacted people who knew the woman, revealing her personal information that had been contained in the form. We were satisfied the agency’s actions had breached IPPs 5 and 11 in this case.
Summary
The extent of this privacy breach caused the complainant and her whānau significant stress and inconvenience over many months. As a result of careless filing, the complainant’s personal and other sensitive information was disclosed to multiple people and ultimately was circulated. Despite the agency’s efforts to contact those who had received the form, there was no way to guarantee that the information was no longer in circulation. The woman reported feeling that her mana and integrity were diminished because of the agency’s failure to keep her information safe. We agreed the agency’s breaches of IPPs 5 and 11 met the threshold in section 69(b)(iii) of the Privacy Act and resulted in significant humiliation, significant loss of dignity, and significant injury to the feelings of the woman. We worked with the parties to resolve the matter. The agency provided a formal letter of apology and agreed to remind staff of the importance of keeping personal information safe. The agency ensured the document was removed from the staff member’s desktop and reminded all staff about the correct process for sending templates and storing client information. The agency also agreed to pay the woman $15,000 in compensation for the interference with her privacy.
Commentary
This case note highlights the importance of agencies strengthening their internal privacy guidelines and being mindful when filing and sending documents. Agencies need to make sure that it is simple for staff to send and use the templates and documents they require for their day-to-day work. If systems are not easy to use, staff might resort to workarounds (like saving things to desktops) that put personal information at greater risk. Agencies also need to create a culture of checking emails and attachments before they go out. The greater the sensitivity of the information, the more checks that should be made. As we saw in this case, a simple mistake can result in significant harm to individuals.
The Privacy Commissioner has announced that his Office will be consulting on new rules specifically for biometrics. Biometrics are increasingly collected by facial recognition technology (FRT), retinal scans, and voice recognition. Michael Webster, Privacy Commissioner, said, “New Zealanders need to have trust and confidence in the use of biometrics by organisations and businesses. My Office will issue a biometrics code exposure draft in early 2024 that we’ll open for everyone to have their say on.”
The Privacy Commissioner announced on 23 November 2023 that he will be progressing draft privacy rules for biometric information. Read the announcement or the explainer document.
In early 2024 our Office will be publicly consulting on an exposure draft of a biometrics privacy code. The exposure draft will propose new rules for agencies that want to collect or use biometric information in automated processes. The process will include public consultation so everyone can have their say. Email the team to be notified when the exposure draft is open for consultation. A code would change how some of the principles in the Privacy Act apply when organisations use technology to analyse biometric information. A code provides regulatory certainty to agencies using or seeking to use biometrics, and it would allow for beneficial uses while safeguarding against privacy risk or harm.
2023 Sir Bruce Slane Memorial Lecture
We are very happy to announce that we have a new date for this year's Sir Bruce Slane Memorial Lecture. The Sir Bruce Slane Memorial Lecture is a highlight on our events calendar and honours New Zealand's first Privacy Commissioner. It will take place on Wednesday 6 December 2023. The lecture will be delivered by Professor Nicole Moreham on the topic of “Balancing privacy and other interests in the social media age”.
Venue: Victoria University Law School, Lecture Theatre 1, Government Buildings, 55 Lambton Quay, Pipitea (Stout Street side).
Time: 5.00pm for drinks and nibbles, with the hour-long lecture commencing at 6.15pm.
Please RSVP yes to rsvp@privacy.org.nz before 22 November for catering purposes if you are attending the talk in person.
Schools concerned about behaviour in their bathrooms have been calling the Office of the Privacy Commissioner (OPC) asking about their options for using CCTV. Schools want to know if they can install closed-circuit television (CCTV) networks in children and young people's bathrooms to deter negative behaviour such as vaping and bullying. Privacy Commissioner Michael Webster says bathrooms are highly sensitive zones for privacy, and there are some clear points that schools need to consider first.
These are:
Schools need to be open with their communities about using CCTV and have clear signage and notices of it. They need to take care to focus cameras away from intimate activity (i.e. toileting and changing). Schools also need to be sure they're not using audio to pick up conversations without additional privacy assessments done first.