NCSC publishes third Cyber Security Assessment Netherlands (CSAN-3)

On October 28th, the third Cyber Security Assessment Netherlands (CSAN-3) was published (.pdf, in English) by the Dutch National Cyber Security Centre (NCSC). These are the core findings of CSAN-3:

  1. Several trends show considerable IT dependence, rising fast due to advances such as hyperconnectivity, cloud computing and the ease with which the internet is used as an enabler. The potential impact of incidents occurring is all the more obvious. 
  2. Digital espionage and cyber crime remain the biggest threats to both government and the business community. This concerns:
    a) Digital espionage originating from a foreign state, aimed at government and the business community. Activities have been identified originating from, among other countries, China, Russia, Iran, and Syria.
    b) IT takeovers by criminals by means of malware infections, aimed at government, the business community and citizens. Criminals are becoming more daring in their ways of earning money quickly, for example, phoning citizens, or confronting them with shocking images in ransomware.
    c) Manipulation of information (fraud) by criminals, aimed at the business community, most obviously internet banking fraud, which victimises both banks and citizens. 
  3. States can develop and deploy advanced tools, while cyber criminals continue to develop their existing tools. Clearly visible in the past year has been the rise of a commercially available cyber services sector, ‘cyber crime as a service’, which offers far easier access to criminal tools to various parties. 
  4. Citizens, businesses, and governments alike are regular victims of botnets and ransomware. Malware can mutate so quickly that anti-virus programs are unable to even detect its presence. Although botnets are mainly used to manipulate (financial) transactions, certain incidents (such as Pobelka) show that the collateral damage of information stolen through botnets can be enormous. 
  5. The IT sector continues to be vulnerable. Following a few years of reduced levels, the number of openly published vulnerabilities in software is increasing again. Cloud services, mobile services and innovative devices all result in new vulnerabilities. 
  6. The end-user is burdened with a big responsibility for security, but more often than not has little influence or even knowledge of the vulnerabilities he confronts in the devices and services. 
  7. Public and private parties are starting up initiatives, both separately and together, to increase digital resilience and in anticipation of the ever-increasing dependence on IT and changing threats. The effectiveness of these initiatives can only be measured in the long term. 
  8. Disruption in the IT sector is displayed publicly, particularly when it comes from Distributed Denial of Service (DDoS) attacks. Resilience has been inadequate at times, which led to a decline in the availability of online services provided by organisations. In addition, DDoS attacks disrupted basic services such as DigiD and iDeal, and this had a chain effect on governmental organisations and businesses that use these services. It is not clear who is behind the DDoS attacks. 
  9. As yet, a broad group of organisations is unable to implement important basic (technical) measures, such as patch and update management or a password policy. Where individual organisations do have their basic security well organised, it appears that shared services and infrastructure are still vulnerable, which in turn leads to a risk for interests that transcend particular organisations. 
  10. The inherent dynamics of cyber security demand a new approach. Static information security measures are no longer sufficient; organisations need greater insight into threats (detection) and need the capacity to deal with the threats (response).

Furthermore, the report states:

In conclusion, a) dependence on IT by individuals, organisations, chains and society as a whole has grown; b) the number of threats aimed at governments and private organisations has risen, mainly originating from states and professional criminals; and c) digital resilience has remained more or less at the same level. Although more initiatives and measures are being taken, they are not always in step with the vulnerabilities, and basic security measures have not always been put in place.

Table 1 gives insight into the threats that various actors use to launch attacks on governments, private organisations, and citizens.  […]

Table 1:
EOF

Strengthening the Dutch-German cooperation on digital security

On October 25th, the Dutch National Coordinator for Counterterrorism and Security (NCTV) published (in Dutch) the following press release:

Strengthening the Dutch-German cooperation

On October 25th, 2013, the Dutch National Coordinator for Counterterrorism and Security (NCTV), Dick Schoof, consulted with the German Federal Government Commissioner for Information Technology, State Secretary Cornelia Rogall-Grothe. The aim of the meeting was to strengthen cooperation between the Netherlands and Germany in the field of cyber security.

Main topics discussed were the strategies of the two countries in the field of cyber security. Both strategies aim to better protect society against cyber attacks and disruptions. They require support from and close cooperation between national and international government agencies, operators of IT infrastructures, scientific institutes and the public. Preventive security measures, information exchange and coordination of actions are the most important topics.

In this context, Ivo Opstelten, Dutch Minister of Security and Justice, and his German colleague Dr. Hans-Peter Friedrich, Interior Minister, agreed on organizing a cross-border bilateral exercise in the area of cyber security. To this end, an exchange is currently taking place between the competent authorities, the German Federal Office for Information Security (BSI) and the Dutch National Cyber Security Centre (NCSC). The bilateral exercise, which is scheduled for the 1st quarter of 2014, focuses primarily on operational issues.

Related:

EOF

SIGINT and wiretapping: the Dutch Intelligence and Security Act 2002

UPDATE 2015-07-02: the Dutch government released a new intelligence bill into public consultation. Details here.
UPDATE 2013-12-04: on December 2nd 2013, the Dessens Committee published their final report. Details here.
UPDATE 2013-11-09: added Oversight on Dutch SIGINT is still broken (blog).

UPDATE 2013-11-02: this Guardian article of November 1st 2013 cites the following from internal GCHQ notes on the Tempora program (=wiretapping fibre-optic cables that carry internet traffic):

“GCHQ also maintains strong relations with the two main Dutch intelligence agencies, the external MIVD and the internal security service, the AIVD. ‘Both agencies are small, by UK standards, but are technically competent and highly motivated,’ British officials reported. Once again, GCHQ was on hand in 2008 for help in dealing with legal constraints. ‘The AIVD have just completed a review of how they intend to tackle the challenges posed by the internet – GCHQ has provided input and advice to this report,’ the country assessment said.
‘The Dutch have some legislative issues that they need to work through before their legal environment would allow them to operate in the way that GCHQ does. We are providing legal advice on how we have tackled some of these issues to Dutch lawyers.'”

Obviously said “legislative issues” include the Dutch WIV 2002 Article 27 (=SIGINT selection) restriction to non-cablebound communications. As stated below, the WIV 2002 is currently in its final phase of an official review by a temporary committee (Dessens Committee) that exists between February 1st 2013 and January 1st 2014. It is expected that changes to the WIV 2002 will be proposed, and that the legal power of SIGINT selection will be extended to also cover cablebound communications. The question then is: what safeguards and oversight will be proposed? Dutch readers are referred to this article by Bits of Freedom.

============ ORIGINAL POST IS BELOW THIS LINE ============

The Dutch Intelligence and Security Act 2002 (WIV 2002) is the legal framework within which the Dutch intelligence and security services AIVD (general) and MIVD (military) operate. On February 1st 2013, a committee was formally established to review the WIV 2002. The committee (‘Dessens Committee’) consists of the following members:

The committee is tasked with addressing the following questions:


  • Did the WIV 2002 bring what the lawmakers intended?
  • Did the WIV 2002 in practice turn out to be a workable instrument for carrying out the tasks of the services?
  • What problems and issues can be identified in the practical application of the law?


And the committee is tasked with giving `particular attention’ to the following topic that is very relevant to the Dutch Joint SIGINT Cyber Unit (JSCU) that was recently established:

  • Are the investigative powers of the services adequate, and are the safeguards sufficient? Current and future developments, such as in technology and in the area of cyber, must be taken into account.

The committee’s report was expected to appear in September 2013, but is (still) overdue. I will update this post when the report is published.

Paragraph 3.2.2 of the WIV 2002 contains Articles 18 to 33, which regulate the special investigative powers:

  • surveillance and monitoring of persons and property (Article 20);
  • deployment of agents (Article 21);
  • establishment of legal persons (Article 21);
  • searches of private places, including housing and closed objects (Article 22);
  • examination of objects to establish the identity of individuals (Article 22);
  • opening letters and packages (Article 23);
  • intrusion into an automated work (Article 24); [hacking]
  • interception of communications, telecommunications or data exchange (Article 25);
  • exploring non-cablebound telecommunications (‘searching’) (Article 26);
  • undirected [=bulk] interception and selection of non-cablebound telecommunications (Article 27);
  • retrieval of traffic and subscriber data from providers (Articles 28 and 29);
  • physical intrusion in support of other powers (Article 30).

In 2009, the Review Committee on the Intelligence and Security Services (CTIVD) published (.pdf, in English) a review report on the application by the AIVD of Article 25 WIV 2002 (wiretapping) and Article 27 WIV 2002 (selection of undirected intercepted non cable-bound [=wireless] telecommunications).
For purposes of international comparison, discussion, etc., I hereby cite from that report the sections that explain Article 25 and Article 27. For better understanding, I recommend reading the entire review report (39 pages) — which also discusses necessity, proportionality and subsidiarity. When the WIV 2002 evaluation report finally appears, I will update this post to include the committee’s opinion on Article 25 and Article 27. It is nearly certain that the SIGINT powers will be extended; notably, that selection of undirected intercepted telecommunications will also be legally possible on cable-bound telecommunications.

2.1 Article 25 WIV 2002

Article 25 paragraph 1 WIV 2002 reads as follows:

“The services are entitled with the aid of a technical device to wiretap, receive, record and listen in on any form of conversation, telecommunication or data transfer by means of an automated work, irrespective of where this takes place. The power, as mentioned in the first sentence, also includes the power to undo encryption of the conversations, telecommunication or data transfer.”

Based on this article the AIVD may for example record conversations using a microphone, wiretap telephone conversations, read email messages and monitor a person’s internet behaviour.

The article is broadly formulated. It involves any form of conversation, telecommunication or data transfer via an automated work. This means, among other things, that not only telephone conversations can be wiretapped, but also that data transfer taking place via a telephone line can be wiretapped [3]. For example, fax messages, or text messages. The advantage of such a broad formulation is that the AIVD can respond to new communication technology.

As shown by the description in the article it does not matter where the conversation, telecommunication or data transfer takes place (‘irrespective of where this takes place’). A microphone may therefore be placed everywhere, including in someone’s dwelling. Whether deployment of a means in a certain place is justified, is assessed on the basis of several assessment criteria including necessity, proportionality and subsidiarity. The assessment criteria are explained in section 4 of this review report.

During the drafting of the WIV 2002 the question was raised whether the words ‘irrespective of where this takes place’ can mean that conversations, telecommunications and data transfer in other countries can be wiretapped from the Netherlands. The government provided the following answer to this:

“First of all we note that the power of these services to wiretap conversations, telecommunications and data transfer as provided in Article 25 among other things, does not extend beyond the jurisdiction of the Dutch State. For the Dutch legislator cannot unilaterally create jurisdiction in other countries. However, this does not alter the fact that application of the power provided for in Article 25, in particular insofar as this concerns the interception of telecommunications as well as application of the powers laid down in the, by memorandum of amendment, inserted Article 25a [Committee: now Article 26] and Article 26 [Committee: now Article 27], can also extend to interception of telecommunications with an origin or destination abroad.” [4]

Application of the methods referred to in Article 25 WIV 2002 implies a serious intrusion on a person’s privacy, because cognisance is taken of the content of the communications of persons and organisations in a directed way. By application of this special power, the privacy of the telephone and telegraph laid down in Article 13 of the Constitution is violated. In the drafting of the WIV 2002, it was chosen not to provide a mandate arrangement for the special powers violating the more specifically provided rights of the Constitution, such as the right to inviolability of the home and the privacy of the telephone and telegraph [5]. This means that pursuant to Article 19 in conjunction with Article 25 paragraph 2 WIV 2002, only the Minister of the Interior and Kingdom Relations is competent to grant the AIVD permission for wiretapping. This permission can be given for a maximum of three months (Article 19 paragraph 3 WIV 2002). At the AIVD’s request to this end, the permission may be extended each time by three months.

2.2 Article 27 WIV 2002

Article 27 paragraph 1 reads as follows:

“The services are entitled to receive and record undirected intercepted non cable-bound telecommunications using a technical device. The power referred to in the first sentence also includes the power to undo encryption of the telecommunications.”

As discussed in the previous section, Article 25 WIV 2002 provides that the AIVD may wiretap, receive, record and listen in on telecommunications. This provision provides for the directed wiretapping of the telecommunications of a person or an organisation known to the AIVD or of a telephone number known to the AIVD.

Article 27 paragraph 1 WIV 2002 also allows the AIVD to intercept and record undirected telecommunications. This concerns non cable-bound telecommunications, i.e. ether traffic in the broadest sense of the word. In particular this refers to the interception of telecommunications traffic that takes place via satellites [6]. The AIVD does not intercept all ether traffic. Pursuant to Article 26 WIV 2002 (the so-called “searching”) it is first assessed what frequencies or satellite channels are possibly interesting to keep under observation. If during searching, frequencies or satellite channels are taken cognisance of that may yield interesting intelligence for the AIVD, the AIVD may select undirected intercepted information sent via such frequencies or satellite channels. The term ‘undirected’ is used because, beforehand, it is unclear what the yield will be and whether it will contain any information relevant to the AIVD. In case of undirected interception and recording, no cognisance is taken as yet of the contents of the communication. The bulk information is only stored in the computer systems.

The AIVD does not need permission for this undirected interception and recording of information (Article 27 paragraph 2 WIV 2002). However, if the AIVD wishes to take cognisance of the contents of the communication, the AIVD must first ask the Minister of the Interior and Kingdom Relations for permission to select intercepted information on the basis of certain criteria, after which the selected part of the intercepted information can be taken cognisance of. The power to select has been included in Article 27 paragraph 3 WIV 2002:

“The services can select the data collected by exercising the power referred to in the first paragraph on the basis of:
a. data concerning the identity of a person or an organisation;
b. a number as referred to in Article 1.1, under bb, of the Telecommunications Act, or any technical feature;
c. keywords related to a subject described in more detail.”

The selection criteria mentioned under a and b do not require much explanation. These concern, for example, names, address details or social security numbers (sub a) or telephone numbers or IP addresses (sub b). Data collection based on these selection criteria concerns specific persons and organisations, as a result of which the search action is referred to as directed. Therefore for selection based on these data the same regime must be followed as with the application of Article 25 WIV 2002, which means that it is only the Minister of the Interior and Kingdom Relations who can give permission, for a maximum period of up to three months, after which a request for extension for another three months can be submitted.

For the selection based on keywords related to a subject to be described in more detail (sub c) a different arrangement has been formulated. In this case, data collection is not focused on a person or an organisation, but it is important in a general sense for the investigations the AIVD is involved with (for example proliferation of chemical weapons) [7]. Here, the keywords do not relate to persons or organisations, but to a specific subject. Upon introduction of this power in the WIV 2002, the following explanation was given:

“A list of keywords related to a subject will as a rule consist of (combinations of) specific technical terms and specifications in various languages. Such a list is drafted in such a way that the selection system is optimally used to find the desired information. For example, a list of keywords in the context of an investigation into proliferation of certain dual-use goods to a specific country or region might consist, among other things, of the names of certain chemical substances and chemical compounds in combination with these countries or regions. A somewhat simplified example concerns the search for messages in which the word sodium (or the Dutch equivalent natrium) is found and also within two positions the word chloride or fluoride. A list of keywords to be used in an investigation into the export of a missile system to certain countries or regions might consist of various names by which the specific missile system is specified, any project names or designations of the various elements that make up part of the system in question.” [8]

Because the personal privacy of persons and organisations is not directly at issue here – as the data collection is not directed at persons or organisations – the Minister of the Interior and Kingdom Relations may give permission for a longer period – namely for a maximum of one year – to select intercepted information in the context of the investigation of a certain described topic. Experts within the AIVD subsequently formulate keywords relating to this topic, on the basis of which the selection can be made. Therefore the Minister of the Interior and Kingdom Relations does not need to give permission for the specific keywords. Legally, the permission regarding the formulated keywords must come from either the Head of the AIVD, or another officer appointed by him. However, the AIVD has opted for having this power exclusively exercised by the Head of the AIVD.
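As an aside: the quoted example of searching for ‘sodium’ (or ‘natrium’) with ‘chloride’ or ‘fluoride’ within two word positions amounts to a simple keyword proximity rule. A minimal Python sketch of such a rule, purely illustrative and of course not the services’ actual selection system; the word lists are taken only from the parliamentary example:

import re

# Toy proximity rule from the parliamentary example quoted above: flag a message
# if "sodium" (or the Dutch "natrium") occurs and "chloride" or "fluoride"
# appears within two word positions of it. Purely illustrative.
TARGETS = {"sodium", "natrium"}
NEIGHBOURS = {"chloride", "fluoride"}
WINDOW = 2  # maximum distance in word positions

def matches_keyword_rule(message):
    words = re.findall(r"[a-z]+", message.lower())
    for i, word in enumerate(words):
        if word in TARGETS:
            lo, hi = max(0, i - WINDOW), min(len(words), i + WINDOW + 1)
            if any(w in NEIGHBOURS for w in words[lo:hi]):
                return True
    return False

print(matches_keyword_rule("shipment of sodium and chloride compounds"))                 # True
print(matches_keyword_rule("sodium levels were discussed; fluoride came up much later"))  # False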

2.3 “Silent tap”

The Committee has established that the AIVD applies a special power – a “silent tap” – under the denominator of Article 28 WIV 2002 (retrieving traffic data) whereas in the Committee’s opinion this method falls under the description of Article 25 WIV 2002 (wiretapping telecommunications).

Article 28 WIV 2002 provides that the AIVD can retrieve (telephone) traffic data from providers of public telecommunications networks and public telecommunications services. This way the AIVD can obtain data about, among other things, the dates and times at which someone called and the telephone numbers used for making the contact [9]. Article 28 WIV 2002 does not serve to take cognisance of the content of the communications that take place via the telephone connection. In that case permission from the Minister of the Interior and Kingdom Relations would be required pursuant to Article 25 WIV 2002, because this concerns the interception of (any form of) telecommunication. This difference was touched upon briefly during the drafting of the WIV 2002, when the monitoring of military data traffic was discussed:

“In our opinion violation of the privacy of the telephone is involved if taking cognisance of the content of a telephone conversation is aimed at the very content itself. If the content of a telephone conversation is taken cognisance of purely as a brief part of an investigation into the identity of persons or institutions communicating with one another, we do not consider this as a violation of the privacy of the telephone. Rather, the [Committee: monitoring of military data traffic] is comparable with an investigation into traffic data. Such an investigation can indeed be considered as a violation of the right of privacy as laid down in Article 10 of the Constitution, but not as a violation of the privacy of the telephone laid down in Article 13 of the Constitution.” [10]

No permission from the Minister of the Interior and Kingdom Relations is required for retrieving (telephone) traffic data (Article 28 paragraph 3 WIV 2002). It is sufficient that the request is made to the telecommunication providers by the Head of the AIVD (Article 28 paragraph 4 WIV 2002).

Article 28 paragraph 1 WIV 2002 provides that the request may pertain both to data already processed at the time of the request and data processed after the request. Therefore the AIVD may ask the telecommunication providers for the data regarding the use of the telephone during, for example, the past month, but the AIVD can also request to keep the service informed of this data in, for example, the next two weeks. In the latter case a technical facility makes it possible that the AIVD has immediate (‘real time’) access to the current data regarding the use of the telephone by a person. This is also referred to as a “silent tap”. Basically, a silent tap is a telephone tap, the difference being that the sound signal in a silent tap is not provided to the AIVD.

The Committee has established that in a silent tap the sound signal may not be forwarded to the AIVD, but that in a number of silent taps applied by the AIVD, the content of (a form of) telecommunication was taken cognisance of as it turned out that text messages were also reaching the AIVD via the silent tap. The Committee has come across a case of a silent tap, whereby the AIVD received approximately 150 text messages in a short period of time. This is an exception. In most cases a much smaller number of text messages are involved. The number of silent taps whereby text messages were received, moreover, involved a minority compared with the silent taps where no text messages were received.

The text messages ending up at the AIVD via a silent tap are (automatically) stored in the AIVD’s digital systems. The AIVD has indicated that at the moment it is impossible to avoid the inclusion of text messages in a silent tap. Nor is it possible at this moment to separate the text messages in the AIVD’s wiretapping room from the information provided to the operational teams. As it does not intend to take cognisance of the content of the communication when applying a silent tap, the AIVD is of the opinion that the received text messages are to be considered so-called by-catch.

The Committee does not share this view held by the AIVD. In Article 2 of the Governmental Decree to Article 28 WIV 2002 [11], text messages are not designated as data that can be retrieved from the telecommunications services pursuant to Article 28. This is not without a reason. The Committee calls to mind the 2000 report of the Committee “Constitutional rights in the digital era”, also referred to – after its chairman – as the Franken Committee. This report noted the following on traffic data:

“Traffic data does not concern the content of the data traffic.
Because Article 13 of the Constitution [Committee: inviolability of the privacy of correspondence, telephone and telegraph] has the very intention of protecting the content of the communication, traffic data is not protected by this Article. This data is however protected by Article 10 of the Constitution [Committee: respect for and protection of the personal privacy].” [12]

The Franken Committee describes various data that will also become visible in traffic data due to ongoing technological developments, an example being that in using the internet not only traffic data is recorded that pertains to the telephone traffic between user and dial-in access point of the provider of internet services, but it is also registered which websites have been visited [13]. In the Franken Committee’s report no examples are provided of data that actually provides an idea of the content of the communications, such as text messages.

In its response to the report, the government shares the Franken Committee’s position that traffic data does not fall under the protection of Article 13 of the Constitution:

“In the Committee and government’s view there exists insufficient justification to bring traffic data under the specific protection of Article 13. This conclusion is related to the fact that traffic data may tell much about persons in our information society, but that the same applies for much more sensitive data that do not fall under the scope of Article 13. No proper arguments can be put forward to make a distinction in the constitutional protection level between categories of personal data based on the fact that it is or is not related to a content that, independently, is subject to constitutional protection.” [14]

The Review Committee agrees with the view that in itself traffic data does not necessarily fall under the scope of the specific protection of Article 13 of the Constitution. However, as the text messages included when current traffic data is sent do themselves contain confidential communication, these messages are not “related to a content enjoying independent protection”, but are in themselves a content enjoying constitutional protection. The fact that a text message involves confidential communication has been phrased by the government in its response to Parliamentary questions on the report as follows:

“Electronic data traffic between individual citizens via email and text messaging falls under the scope of the proposal for Article 13 of the Constitution because it concerns confidential communication. The protection of confidential communication is not limited to communication actively protected, for example by means of encryption of the message. As mentioned, the nature of the channel chosen, the (manner of) addressing and the nature of the communication may serve as a guideline in determining the confidentiality.” [15]

The text messages included in a silent tap therefore, in the Review Committee’s opinion, fall under the protection of Article 13 of the Constitution. This is in line with what the government has also included in its position on the traffic data:

“Insofar as taking cognisance of traffic data coincides with taking cognisance of information concerning its content, content-related information is involved. This content-related information falls under the stricter regime of Article 13.” [16]

For wiretapping, interception, recording and listening in on (any form of) telecommunication – as a result of which the privacy of the telephone laid down in Article 13 of the Constitution is violated – a special provision has been included in the WIV 2002, namely Article 25, the application of which has been surrounded by extra safeguards as it is only the Minister of the Interior and Kingdom Relations who can give permission for the application of this method. The Committee is of the opinion that, as long as it is technically unfeasible to avoid text messages being sent along with a silent tap or to ensure that text messages are separated in the wiretap room, a silent tap falls under the description of Article 25 paragraph 1 WIV 2002, namely under ‘any form of telecommunication’, because as a result of text messages being sent along, cognisance is taken of the content of the communication. Therefore the Committee urgently recommends that the request for permission to apply a silent tap must be made to the Minister of the Interior and Kingdom Relations in the way set out in Article 25 WIV 2002.

[0-2] (not used)
[3] Parliamentary Documents II 1997/98, 25 877, no. 3, p. 41.
[4] Parliamentary Documents II 1999/2000, 25 877, no. 8, p. 65.
[5] Parliamentary Documents II 1999/2000, 25 877, no. 8, p. 45-46 and Parliamentary Documents II 2000/01, 25 877, no. 59, p. 7-8.
[6] Parliamentary Documents II 1997/98, 25 877, no. 3, p. 44.
[7] Parliamentary Documents II 1997/98, 25 877, no. 3, p. 45.
[8] Parliamentary Documents II 2000/01, 25 877, no. 14, p. 33.
[9] A full listing of the data that can be retrieved has been included in the Governmental Decree to Article 28 paragraph 1 WIV 2002, to be referred via http://wetten.overheid.nl.
[10] Parliamentary Documents II 2000/01, 25 877, no. 14, p. 35.
[11] See footnote 9.
[12] Report Committee on Constitutional rights in the digital era, May 2000, p. 159.
[13] Report Committee on Constitutional rights in the digital era, May 2000, p. 160.
[14] Parliamentary Documents II 2000/01, 27 460, no. 1, p. 27.
[15] Parliamentary Documents II 2000/01, 27 460, no. 2, p. 59.
[16] See footnote 14.

Related:

EOF

TorRAT: four Dutch suspects arrested in EUR 1M digital fraud and money laundering case

UPDATE 2013-10-29: added link to article on TorRAT by Tanya Shafir posted on April 22nd 2013.

On October 24th, the Dutch Public Prosecution Service announced the following:

Hackers plunder bank accounts
October 24, 2013 – Public Prosecution Service

Hackers are suspected of looting bank accounts and making hundreds of fraudulent transfers by installing malicious software on the computers of Dutch bank account holders.

On Monday, the police arrested four men from Alkmaar, Haarlem, Woubrugge and Roden on suspicion of involvement in a large-scale digital fraud and money laundering case.

Fake email messages were sent containing a link that activates the so-called banking malware, giving the hackers access to the computers of unwitting account holders. The invading `TorRAT’ malware manipulates online banking sessions by adding, modifying or deleting data. The malware adds new payments, or changes existing payment orders without the account holder being able to see it.

TorMail

To protect their criminal activities the suspects made use of TorMail, a free service that allows users to anonymously send and receive messages.

The fraudulent transfers ended up in the bank accounts of money mules, who were recruited to make their bank accounts available or to open new bank accounts and hand over their credentials. To channel the stolen money, domestic and foreign companies were created and business bank accounts were opened.

Bitcoins

Moreover, the defendants exchanged allegedly criminally obtained money for bitcoins, a form of electronic currency. One of the men himself ran a bitcoin exchange service where (cash) money could be converted into bitcoins. The Public Prosecution Service seized 56 bitcoins, which were exchanged for more than 7,700 euros.

The police investigation focuses on the period from spring 2012 to the present, and on more than 150 fraudulent transactions. Several banks and companies have reported cybercrime. The extent of the damage is possibly around one million euros.

The suspects were remanded in custody for two weeks by the examining magistrate in Rotterdam.

Related:

EOF

Mandatory Privacy Impact Assessments for Dutch govt IT projects

UPDATE 2015-03-17: US DHS also performs PIAs and has an excellent page on what comprises a PIA in their context and how they carry it out, including guidance and template documents.

UPDATE 2013-11-03: regarding the EU-proposed PIA, the UK Deputy Information Commissioner David Smith stated: “It might sound scary, but it should help organisations to design systems that respect individuals’ privacy and so command the confidence of customers and the wider public.” That’s exactly right!

In 2011 the Dutch government published the “I-Strategy 2012-2015” document; see Trust, Privacy & Security in Dutch Govt “I-Strategy”. The I-Strategy describes the Rutte-cabinet coalition agreement on information strategy for the period 2012-2015. One of the topics covered in the strategy is that the requirements for the content of project plans (including legislative proposals) for large IT projects (26 643, nr. 135) would be “supplemented with the demand to state whether the project involves privacy-sensitive data and linkage or data enrichment. The project plan will state, with arguments, whether a Privacy Impact Assessment or a similar instrument applies.”

As of September 1st 2013, a new rule applies in the Netherlands to all `large’ (?) ICT proposals initiated by national-level government entities (Ministries, etc.): ICT project proposals must now include a Privacy Impact Assessment (PIA). In June 2013, the Dutch administration proposed a PIA model (.pdf, in Dutch).

Within two years (before September 2015), it will be evaluated to what extent the PIA supports:

  1. improvement of legislative quality;
  2. realization of the I-Strategy.

For purposes of international comparison, discussion, etc., I hereby provide an English translation of the entire PIA model drafted by the Dutch government. For clarity of exposition I do not include the original Dutch text in-line.

Please send corrections / suggestions for improvement to @mrkoot (Twitter) or to koot at cyberwar dot nl (e-mail).

On July 9th 2013, the Dutch Senate responded (.pdf, in Dutch) to the proposed PIA model and points out that improvements are needed. First, here’s my translation of the Senate’s response:

The members of the Senate Committee for Security and Justice (V&J) received the letter dated June 22nd 2013 on the proposed model for Privacy Impact Assessments (PIA). This template is an elaboration on and implementation of the coalition agreement, the motion of Senator Franken et al. (EK 31051, D), the commitment to further development of a PIA (T01516) and the measures announced in the I-Strategy 2012-2015 to strengthen the attention to privacy in large ICT projects. The members of the committee have a few questions.

These members note that the proposed PIA model only covers the risk-identification part of a PIA. This is a good and appropriate step, but a full PIA must also cover the next stage. The proposed European privacy regulation [0] assumes a full PIA. Can the government explain how they see the relationship of the proposed Dutch PIA model with the PIA of Article 33 of the proposed European privacy regulation? The third paragraph of the proposed Article 33 requires that the PIA also includes an assessment of the risks to the rights and freedoms of data subjects, the measures envisaged to reduce the risks and ensure security, and mechanisms that ensure the protection of personal data and prove compliance with this European regulation. These requirements are not automatically covered in the PIA model proposed by the Dutch government. These members would like to receive a response from the government on this point.

In any case, the proposed model is based primarily on the Dutch Data Protection Act (DPA), and new elements of the proposed European privacy regulation are not included, such as the requirement to apply the principles of “privacy by design” and “privacy by default”. Although the text of the proposed European privacy regulation is not yet final, the expectation is justified that mandatory application of these principles will also be included in the final text. Why has the government not been able to include such obligations in the PIA, and is the government willing to do so now?

Furthermore, the committee members question whether the government has sufficiently considered the consequences of answering the questions in the PIA model. It is by no means clear in all cases what impact a particular answer has. For example, the first question of part II.1 reads: “Did you establish the specific purpose(s) for the intended processing of personal data?” This is an important question. However, the meaning of a “no” answer is left open and no consequences are attached. Does this then mean that a risk is merely identified? What are the consequences? The key questions seem to be designed to apply the current Dutch DPA, rather than to examine the privacy risks for those whose personal data is processed. What is the government’s perspective on this? Is the government willing to make a next revision of the proposed model more in agreement with the intended purpose of a PIA, as well as with the requirements in the expected European privacy regulation?

The members of the committee are looking forward to your answers to these questions within four weeks. An identical letter was sent to the Secretary of Housing.

[0] 2012-01-25, European Commission, COM(2012)11 final, E120003, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (.pdf, in English)

The government has not yet responded to the letter, except for stating (.pdf, in Dutch) that it would not be able to respond within four weeks due to the summer recess.

I will update this post when the government has responded to the Senate’s questions.

Finally, here is my translation of the proposed Dutch official PIA model to which the Senate’s criticism still applies. Hyperlinks and parts in [] are mine. (Note: Google Translate was remarkably accurate in translation of some paragraphs.)

WARNING: this is an unofficial translation.

Dutch national govt Privacy Impact Assessment (PIA) model

[FIRST DRAFT; JUNE 2013]

A. Introduction

1. What is a PIA?

1. A Privacy Impact Assessment (PIA) is a tool for identifying, in a structured and clear way, privacy risks in policy development, and the associated legislation or construction of ICT systems and construction of databases. The PIA model is specifically aimed at the [Dutch] national government and intended to be used in all areas of policy and in all areas of law.

2. The PIA is in the form of a test model / questionnaire. It includes both factual and technical questions and questions that are based on national and European legal requirements. It is aimed at drawing attention, at an early stage, to all parts of the intended processing of personal data that require attention and elaboration.

3. A PIA is not a voluntary survey. In particular, the questionnaire content is intended to be both direction-giving and corrective. In addition, the answering process as such should stimulate awareness of the various privacy issues that need to be considered when developing legislation and policy, and the development of ICT systems and databases in that context.

4. A PIA is direction-giving in the sense that the (exhaustive) series of questions may indicate relevant privacy risks that have (perhaps) not yet been identified in the early stages of policy or system development. If that is the case, the relevant question must be understood as a necessity to take these aspects into consideration.

5. A PIA is also corrective. By the order of the questions it will often be necessary to reconsider provisional answers to previous questions and consider an alternative (less privacy-invasive) solution. It will frequently happen that the considerations and decisions made at an earlier stage of policy or system development cannot be substantiated well enough on closer inspection due to the associated privacy risks.

6. Because of the direction-giving and corrective character of a PIA, filling in the questionnaire will often be a dynamic process, where draft (policy) solutions or concept-functional system designs are gradually tightened.

7. A PIA should be used in addition to, and in coordination with, other tools for development of legislation and policy, and associated construction of ICT systems and construction of databases. Hence, a PIA does not replace these existing instruments, and is not intended to overlap with those.

8. If the PIA is performed in the context of developing policies that will result in legislation, the `Guidelines for alignment with Dutch DPA’, included in the IAK, must be used.

9. If the PIA is performed in the context of developing policy that (also) provisions the construction of data files or the construction of ICT systems, attention should also be given to the control measures described in the `handbook portfolio for Dutch govt projects with a large ICT component’.

10. Answering the PIA questionnaire results in a written document.

2. When is the processing of personal data by the Dutch national govt, including independent administrative bodies [=Dutch “ZBO’s”], necessary? (and is a PIA relevant at all)?

1. Use of personal data, including use by the government, is in many cases a limitation of the fundamental right to protection of privacy (Article 10, paragraphs 2 and 3 Constitution, Article 8 of the ECHR, Article 8 EU Charter of Fundamental Rights).

2. Once this comes into consideration in the context of development of policy and legislation, and the associated construction of ICT systems and construction of databases, it must first be determined whether processing of personal data is necessary for the intended goal. This concerns both subsidiarity and proportionality.

3. With regard to subsidiarity, the (pre-)question is: can the desired policy outcome only be achieved through processing of personal data? Are there any practical or effective technical alternatives that do not intrude on privacy? (This may, for example, include considering not handling personally identifiable information for aspects of the proposal that only capture general trends or patterns.) If alternatives to processing of personal data with the same policy results are available, those should be chosen.

4. For the development of policy and legislation, the answering of the questions of subsidiarity of personal data processing can be done using the `Guidelines for alignment with Dutch DPA’, included in the IAK, on alignment with (international) (classical) fundamental rights.

5. If the (preliminary) finding is that alternatives to processing of personal data do not exist, it is important to use the PIA model. Thus, all questions related to proportionality of the processing of personal data are clearly mapped and solutions can be formulated that do not go beyond what is necessary to achieve the desired outcome. These may include, for example, differentiating measures (is the processing of the same data needed for all aspects of the policy proposal?), or offering those involved the possibility of an “opt-out” in certain specific circumstances.

6. A PIA must thus be used as early as possible in the process of developing policy that provisions processing of personal data, whether or not accompanied by legislation or construction of ICT systems.

3. How should a PIA be used?

1. Policy initiatives and legislative initiatives within the national government to process personal data have many forms. On the one hand, it may be an entirely new database or system in which a new set of data for a new purpose will be processed. On the other hand, it may involve adding a new type of personal data to be processed in an existing IT system, or linking several existing databases or systems to achieve a new purpose. It may also involve new forms of distribution, exchange, disclosure and (multiple) use of data.

2. The PIA questionnaire was prepared for the entire spectrum of new forms of data processing. The privacy risks to be identified using the PIA questionnaire will however greatly depend on the nature of the policy or bill or proposed IT system or database. It will therefore differ from case to case which of the PIA questions must be answered.

3. It is not necessary to complete the full questionnaire when the following is involved:
– Expansion of the database within an existing IT system (it suffices to answer the questions in Sections I and IV)
– Using an existing database or ICT system for additional or new goals (it may suffice to answer the questions in section II and IV)
– Linking various existing databases or ICT systems for existing, additional or new purposes (it suffices to answer the questions in sections II-V)
It goes without saying that in the performance of such a “PIA-light”, it is sensible to refer to any previous documents (previous PIAs, other impact assessments, explanations).
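[Not part of the official model: the decision rules in point 3 above amount to a small lookup table. A minimal Python sketch, with my own shorthand scenario labels:]

# Illustrative mapping of the "PIA-light" scenarios in point 3 to the
# questionnaire sections that need to be answered; labels are my own shorthand.
PIA_LIGHT_SECTIONS = {
    "expand data within an existing ICT system": ["I", "IV"],
    "use an existing database/system for new goals": ["II", "IV"],
    "link existing databases/systems": ["II", "III", "IV", "V"],
}

def sections_to_answer(scenario):
    # Any other scenario requires the full questionnaire (see point 4 below).
    return PIA_LIGHT_SECTIONS.get(scenario, ["I", "II", "III", "IV", "V"])

print(sections_to_answer("link existing databases/systems"))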

4. In all other cases, given the relationship between the aforementioned questions, and the direction-giving and corrective nature of the PIA, the entire questionnaire should be completed.

5. The final answers to the PIA questions will have to serve as the basis and source for technical, policy-related and legal justification of choices (see further below at 5).

4. Who? Implementation and coordination

1. The PIA questionnaire must be completed by the staff members or legal drafters of the Minister who, or the independent administrative body [=Dutch “ZBO”] that, is or will be “responsible” for the processing of personal data within the meaning of the Dutch DPA.

2. “Responsibility” exists, in terms of the DPA, if this department of the national government is the entity that determines the purposes and means of the processing of personal data.

3. A PIA does not have to be performed by policy makers and legal drafters of the Ministry or the part of the national government that only acts as “processor” within the meaning of the DPA, i.e., that only acts at the request of a responsible party. In case of uncertainty, please contact the legal department of your Ministry.

4. The Data Protection Officer (DPO) within your department is responsible for the independent supervision of implementation and compliance with the DPA. You can contact the DPO for advice when answering the questionnaire or on the results of the answer. The DPO can identify issues and help identify risks.

5. If your policy or legislative proposal relates to the construction of an ICT system or the creation of a data file, please also contact your departmental Chief Information Officer (CIO) at an early stage. The CIO gives an opinion at the start of a project or interim change, as stated in the I-Strategy. Part of this is the examination of whether the project plan states whether the project includes the collection of privacy-sensitive data or the linking or enriching of data, and whether it is argued whether a PIA is required.

5. Use and accountability of PIA results

1. A seriously performed PIA will have had a direction-giving and corrective effect. Plans are focused and developed. This means that in the preparation of legislation, policies and government ICT systems, privacy aspects as such have become part of the deliberation process. Considering that adjustments based on decisions in the PIA process will already be included in the final answer to the PIA questions, only the final answers are used in the further development of policies and systems.

2. The considerations and choices reflected in the final answers will vary by legislative or policy proposal or IT system. To account for the final use of personal data, previous policy choices and solutions in other contexts will have to be referenced. In addition, new aspects or elements that deviate from the choices (e.g., more data than before, a different system than before, etc.) will require further consideration.

3. Results of a PIA should be sent to the involved DPO and the CIO. Depending on the context in which the PIA is performed, the results are processed in different ways.

4. Where policies are involved that provision the construction of ICT systems or the construction of databases, the DPO on this basis can provide advice in determining the necessary measures and safeguards to be set out in policies, instructions, manuals and procedures. In addition, the CIOs can use the results for advising on information security and system design. Also, the PIA results provide input for any notice of the proposed processing to the Dutch Data Protection Authority [=CBP] or the DPO, which are made public according to the relevant rules.

5. In legislation, a passage about the PIA results is included in the Explanatory Memorandum. It can be a summary of the main considerations and choices. This passage can be added to the already required considerations of the constitutional framework and the review of the Data Protection Act (see above, under A). Although a fully standardized accountability section can therefore not be given, a model element of the Explanatory Memorandum can be:
“Given the nature of this proposal, a Privacy Impact Assessment has been carried out at the stage of policy development (see also Kamerstuk I 2010/11, 31051, No. D, motion-Franken). Using this, the necessity of data processing has been reviewed, and the implications of the measure(s) have been identified in a structured manner. In particular, attention has been given to the principles of data minimization and purpose limitation, the requirement of adequate security, and to the rights of those involved. <Description of specific aspects and the judgments made in this case>”

B. Questionnaire

Processing of personal data has a strong legal framework. On the other hand, the text of the DPA is often experienced as abstract and inscrutable. In this light, the questionnaire below contains both practical questions and questions of a legal nature. The practical questions are meant to map the entire trajectory of data processing, and the agencies involved. When it comes to legal questions, the wording of the questions is crucial. In that case, it is attempted insofar as possible to explain this, and to add examples. If uncertainty exists about the content of the question, it is sensible to contact the DPO at your Ministry or the legal department.

I. Basic information: type of personal data, type of processing and necessity / data minimization

1. Do you, as responsible party, intend to use personal data for provisioned data processing? If so, what type of personal data?

Notes: Definition of `personal data’: any information relating to an identified or identifiable person (Art. 1 of the Dutch DPA).
Definition of `particular (sensitive) personal data’: data on religion or belief, race, political opinions, health, sexual life, trade union membership, criminal record; cf. the definition in Art. 16 of the Dutch DPA.
Definition of `responsible party’: the natural person, legal person or any other person who, or the governing body that, alone or jointly with others determines the purposes and means of the processing of personal data.
Note: If your organization only acts as a processor (the one who processes data at the request of the responsible party, without being subject to his direct authority), this questionnaire must be completed not by you, but by the responsible party.
Definition of `processing’: any operation or set of operations performed upon personal data, including at least the collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or any other form of making available, alignment or combination, interrelation, blocking, exchange or destruction of personal data.

2. Other specific personal data?
2a. Will data be processed about the financial or economic situation of those whose data is processed, or other data that might lead to stigmatization or exclusion?

Notes: this includes, for example, data about (problematic) debts, gambling addiction, school performance, or problems at work or in a relationship.

2b. Will data be processed about vulnerable groups or persons?

Notes: this includes, for example, minors, mentally disabled, people who are dealing with stalking, whistleblowers or informants for police and prosecution.

2c. Will usernames, passwords and other credentials be processed?

Explanation: The possible consequences for those involved depend on the personal data being processed and on what the credentials grant access to. It should be taken into account that many people reuse passwords for different purposes.

2d. Will uniquely identifying information be processed, such as biometric data?

Explanation: This type of data is not formally classified as `sensitive data’ in the EU Data Protection Directive 95/46 or in the Dutch DPA, but has come to be treated as such in the national and European legal and practical context. Pending European proposals for adjustment of data protection regulations continue this trend by categorizing the processing of biometric data as a specific risk.

2e. Will the SSN [=”BSN” in Dutch] or another personal number be processed?

Explanation: The Dutch DPA (Art. 24) provides that a law-prescribed number for identification of a person in the course of data processing is only processed for the purpose of implementing the law or the purposes determined by law. If necessary, refer to the `Decree on use of SSN and Dutch DPA’ of 15 August 2001.

3. For each of the types of personal data specified in answer to questions I.1 and I.2, can it be substantiated that its processing is technically or by policy directly relevant and indispensable for achieving the intended outcome of the policy? What exactly would remain unclear if it were decided not to process certain information? Explain for each type of personal data.

Explanation: The Dutch DPA provides the so-called `principle of data minimization’. Personal data may only be processed if a necessity exists (Art. 8). Art. 11, paragraph 1 also provides that personal data may only be processed if it is, given the purposes for which they were collected or subsequently processed, adequate, relevant and not excessive (relevance requirement). It is also important that the processing of sensitive personal data is, in principle, prohibited (Art. 16-23), and only permitted under strict(er) conditions.

4. When it comes to sensitive personal data, can the same policy effect or technical result be achieved in one of the following ways: (a) through (combined) use of normal data, (b) by using anonymous or pseudonymous data?

Notes: Anonymization means removal of all direct and unique identifying data. Pseudonymization means systematic replacement of directly identifying personal data, for instance by a code that permits certain authorized parties to still add data, but that does not allow identification of a person. This can e.g. be done by processing data through a specific algorithm directly after collection, where analysis and comparison remain possible but the source of the data themselves cannot, in principle, be traced.
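[Not part of the official model: a minimal Python sketch of the kind of pseudonymization described in the note above, assuming a keyed hash (HMAC-SHA256) as the replacement scheme; the field names and key are illustrative only.]

import hashlib
import hmac

# Replace a directly identifying value by a stable keyed code: records about the
# same person can still be linked and compared, but the identifier cannot be
# recovered without the secret key. Key and field names are illustrative.
SECRET_KEY = b"key-held-by-an-authorized-party"

def pseudonymize(identifier):
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"bsn": "123456782", "postcode": "2511 CV"}
record["bsn"] = pseudonymize(record["bsn"])  # direct identifier replaced by its pseudonym
print(record)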

5. In what broader legal, policy and technical framework is the policy / database / information system to be developed and what kind(s) of processing of personal data is going to be part of the planned trajectory? Are (new) technologies or information systems used?

Notes: Enumerate all processing of personal data, and responsibilities, and clearly display the entire trajectory, for example by means of a visualization, so that the entire trajectory of data processing is transparent.

II. Purpose limitation, linkage, quality and profiling


Purpose limitation and linkage

1. Did you decide, in detail, on the specific purpose(s) for which you intend to process personal data? Is it one and the same specific purpose?

Explanation: The Dutch DPA (Art. 7) provides that personal data may only be collected for specific, explicit and legitimate purposes. For example, it can be indicated in legislation that personal data are processed for the limited purpose of combating illegal immigration. The processing must be justified on one of the grounds of the DPA (Art.). If multiple objectives are pursued by the collection of personal data, then these must all be made explicit, and for each objective it must be substantiated why the (entire) set of data is necessary to reach it.

2. Does the project/system involve the use of new data for an existing objective, or existing objectives within existing systems? (scenario of addition of new data).

Explanation: The Dutch DPA provides the so-called `principle of data minimization’. Art. 11, paragraph 1 provides that personal data may be processed only insofar as they are, given the purposes for which they were collected or subsequently processed, adequate, relevant and not excessive. This means that if the data to be processed in an existing system is expanded, justification must exist for each of the new personal data to be processed. For a review of the data to be added, also see questions I.1-4 above.

3. Does the project/system involve the pursuit of new/additional objectives by using, comparing, sharing, linking or otherwise further processing existing personal data, or collections thereof? (scenario of addition of purposes). If so, do all persons/agencies/systems involved in the processing have the same objective for processing the personal data, or may tension exist, considering their position or interest? Do the same objectives apply to the entire trajectory?

Explanation: The Dutch DPA (Art. 9, paragraph 1) states that data may not be processed (e.g. in the form of linkage or comparison with other data, or adding other data to achieve a specified goal) in a manner that is incompatible with the objective(s) for which they were obtained initially. Walk through the entire planned trajectory of the personal data, and state for each part whether an objective exists other than the objective for which the data was collected.

4. If you have answered positively to questions II.2 and II.3, how is such intended use (i.e., addition of new personal data to existing systems or use of existing personal data for new purposes) reported to: (a) the DPO, or (b) the Dutch Data Protection Authority [=”CBP” in Dutch] if there is no DPO?

Explanation: The Dutch DPA (Art. 62) provides the possibility to appoint a DPO. This officer supervises the processing of personal data. The supervision by that official extends to the processing of personal data by the controller who appointed him. The officer may make recommendations to the responsible party for better protection of the processed data. According to Art. 27, paragraph 3, planned processing of data shall be reported to the DPO. If no DPO exists, this should be reported to the CBP.

5. If you have answered positively to questions II.2 and II.3, what (further) controls are foreseen on such use (i.e., use of new data in existing systems or use of existing personal data for new purposes)?

Notes: see notes to questions II.2 and II.3. An example could be planning an internal review or an external evaluation.

Quality

6. Which periodic and occasional checks are provided for examining the correctness, accuracy and timeliness of the data processing foreseen in the policy proposal, the bill, or IT system?

Explanation: The Dutch DPA (Art. 11, paragraph 2) states that measures should be taken to ensure that personal data are correct and accurate, given the purposes for which data were collected or further processed.

Profiling

7. Will the collected/processed data be used to identify and/or assess and/or predict the behavior, presence or performance of people? Are the subjects whose data is processed aware of this? Do the data originate from different (possibly external) sources, and were they originally collected for other purposes?

8. Does this analysis/assessment/prediction involve the use of technically automated comparison of personal data (i.e., is it not performed by humans)? If so, what procedure is in place to ensure that concrete action is only taken after the intervention and (second) control of (human) staff?

Explanation: The Dutch DPA (Art. 42, paragraph 1) states that no one may be subjected to a decision that carries legal consequences if that decision is taken solely on the basis of the automated processing of data intended to evaluate certain aspects of his/her personality.
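
One possible procedural safeguard for this requirement is to let the automated comparison only flag cases, and to withhold any decision with legal consequences until a staff member has explicitly reviewed and confirmed the flagged case. A hypothetical sketch in Python; the scoring rule, threshold and names are invented for illustration:

    # Illustrative only: the automated step may score and flag, but never decides;
    # a decision is released only after explicit human confirmation.
    from dataclasses import dataclass

    @dataclass
    class Case:
        case_id: str
        risk_score: float          # output of some automated comparison
        human_confirmed: bool = False

    REVIEW_THRESHOLD = 0.8         # hypothetical threshold

    def automated_flag(case: Case) -> bool:
        """Automated step: flag only, never decide."""
        return case.risk_score >= REVIEW_THRESHOLD

    def decide(case: Case) -> str:
        if not automated_flag(case):
            return "no action"
        if not case.human_confirmed:
            return "queued for human review"   # decision with legal effect is withheld
        return "action approved by human reviewer"

    print(decide(Case("C-1", risk_score=0.93)))                        # queued
    print(decide(Case("C-2", risk_score=0.93, human_confirmed=True)))  # approved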

III. Relevant authorities/systems and responsibility

1. What internal and external body/bodies and/or systems is/are involved in the data processing foreseen in each of the various phases identified under I.5? Which providers are there, and which recipients? What files or partial files, and which infrastructures?

2. Is it clear, at every stage, who is responsible for the processing of the personal data? If yes, is this person or organization adequately prepared and equipped to respect the necessary provisions and measures, including resources, policies, responsibilities, procedures and internal control?

Explanation: The Dutch DPA (Art. 1d) designates as the responsible party the natural or legal person, or any other person or governing body, which alone or jointly with others determines the purposes and means of the processing of personal data.

3. Who exactly within your organization, and within each of the other organizations involved, will get access to the personal data? Is it possible that use of the data could result in the data becoming accessible to unauthorized parties?

4. Does a restriction apply to one or more of the authorities involved on the ability to process data due to confidentiality obligations (related to function/law)?

Explanation: The Dutch DPA (Art. 9, paragraph 4) states that personal data shall not be processed insofar as a duty of confidentiality by virtue of office, profession or legal provision precludes this. Such a duty of confidentiality may, for instance, apply to physicians and (youth) aid workers.

5. Are all stages of processing, i.e. the data types and exchanges, mapped or capable of being mapped, such that it is clear to those whose data is processed who processes their personal data, why, and how?

Explanation: The characteristics of the processing should always be available as a condition for being “in control” as a responsible party, in particular with regard to the notification and information obligations of those whose data is processed (Art. 27, first paragraph, and Art. 30, paragraph 3 of Dutch DPA).

6. Are policies and procedures foreseen that provide for the creation and maintenance of a collection of the personal data that you want to use? If so, how often and by whom will the processing be monitored? Does the processing include a collection that is performed on your behalf (e.g. by a subcontractor)?

7. Does the processing involve a transfer of personal data to a (government) agency outside the EU/EEA? Does that country have an adequate level of data protection, as decided by the European Commission or the Dutch Minister of Security and Justice? Are all of the data transferred, or only a part?

Explanation: The Dutch DPA (Art. 76) provides that personal data may be transferred to a country outside the EU/EEA only if that country ensures an adequate level of data protection. As for the U.S., the European Commission states that organizations that have committed themselves to comply with the so-called `safe harbor’ principles are also considered to guarantee adequate protection. A complete list of the Commission’s decisions on the adequacy of protection in third countries (such as Israel, Argentina and Australia) can be found at the following website: http://ec.europa.eu/justice/data-protection/document/international-transfers/adequacy/index_en.htm

IV. Security and retention/destruction


Security

1. Is the policy on data security in your organization in order? If so, who/which department(s) is/are responsible for making, implementing and enforcing that policy? Is this policy specifically focused on data protection and data security?

Explanation: The Dutch DPA (Art. 13) requires that organizational and technical measures are taken to protect against any form of unlawful processing of personal data.

2. If (a part of) the processing takes place at a processing party, how will you ensure data security, and supervision thereof, at that processing party?

Explanation: The Dutch DPA (Art. 14, paragraph 1) requires the responsible party to ensure that a processing party, if it takes upon itself (a part of) the processing, takes sufficient technical and organizational measures. In accordance with paragraph 2, a processing contract must be drawn up. Based on the DPA, compliance with these measures must be supervised (Art. 14, paragraph 1).

3. What technical and organizational security measures are taken to prevent unauthorized or unlawful processing/abuse of (a) data that exists in an automated format (e.g., password protection, encryption) and (b) data that are recorded manually (e.g. putting locks on cabinets)? Does a higher level of protection exist for sensitive data?

Explanation: To determine the appropriate level of risk, see the “Guidelines for Security of Personal Data”, 2013, at: http://www.cbpweb.nl/Pages/pb_20130219_richtsnoeren-beveiliging-persoonsgegevens.aspx
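
For automated data, one widely used measure of the kind meant in question IV.3 is encryption at rest, with the key kept separately from the stored data. A minimal sketch using the third-party Python package `cryptography’ (an assumption; any comparable, vetted library would do):

    # Illustrative encryption at rest: personal data are encrypted before storage and the
    # key is kept elsewhere (e.g. in a key management system), so a copied database file
    # alone does not expose the data.
    from cryptography.fernet import Fernet  # third-party package: cryptography

    key = Fernet.generate_key()        # in practice: load from a key management system
    fernet = Fernet(key)

    plaintext = b"name=Jansen;ssn=123456789"
    ciphertext = fernet.encrypt(plaintext)          # store only the ciphertext
    assert fernet.decrypt(ciphertext) == plaintext  # recovery requires the key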

4. What procedures exist in the event of breaches of security regulations, and to detect such breaches? Does an emergency plan exist to deal with an unforeseen event in which personal data are lost, or exposed to unlawful processing?

Retention/destruction

5. How long will the data be stored? Does the same retention period apply for each of the types of personal data collected? Is the project subject to any statutory/sectoral requirements regarding retention?

Explanation: The Dutch DPA (Art. 10, paragraph 1) states that personal data are not kept in a form which permits identification for longer than is necessary to achieve the purposes for which the personal data are collected and processed.

6. Which policy-related and technical reasons require this storage period?

7. What measures are planned to destroy the data after the retention period expires? Are all personal data, including log data, destroyed? Is the destruction supervised, and by whom?
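
In practice, the destruction asked about here is often implemented as a periodic, supervised job that removes (or irreversibly anonymizes) every record, including log entries, whose retention period has expired, and that itself leaves an audit trail. A hypothetical sketch in Python with SQLite; the table and column names are invented for illustration:

    # Illustrative retention sweep: delete personal data and related log entries older
    # than the retention period, and record that the sweep ran.
    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION_DAYS = 365  # hypothetical retention period

    def purge_expired(conn: sqlite3.Connection) -> None:
        cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
        with conn:  # single transaction
            conn.execute("DELETE FROM personal_data WHERE collected_at < ?", (cutoff,))
            conn.execute("DELETE FROM access_log WHERE logged_at < ?", (cutoff,))
            conn.execute(
                "INSERT INTO destruction_log (ran_at, cutoff) VALUES (?, ?)",
                (datetime.now(timezone.utc).isoformat(), cutoff),
            )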

V. Transparency and rights of data subjects

Transparency

1. Is the purpose of the data processing known to those whose data is processed, or can it be made known? What is the procedure for informing the subjects, if needed, about the purpose of the processing of their personal data?

Explanation: The obligation of transparency provided here is distinct from (and is in addition to) the legal knowability requirement (stating the purpose of a data processing operation in legislation itself). The purpose of this transparency obligation is to inform subjects about the processing at a place/time related to the (proposed) processing. For example, does the form include information about the purposes of the data collection? Or are roadside signs provided that announce video surveillance?

2. If you obtain the data directly from the data subject, how do you inform them about your identity and the purpose of the processing prior to processing it?

Explanation: The Dutch DPA (Art. 33) lays down specific rules for this type of notification to subjects. The transparency obligation referred to here is distinct from (and is in addition to) the legal knowability requirement (stating the purpose of a data processing operation in legislation itself). The purpose of this transparency obligation is to inform subjects, whether or not at their request, at a place/time related to the (proposed) processing.

3. If you obtain the personal data via another (government) organization, how will the data subjects be notified about your identity and the purpose of the processing at the time of processing?

Explanation: The Dutch DPA (Art. 34) lays down rules for notification of subjects. The transparency obligation referred to here is distinct from (and is in addition to) the legal knowability requirement (stating the purpose of a data processing operation in legislation itself). The purpose of this transparency obligation is to inform subjects, whether or not at their request, at a place/time related to the (proposed) processing.

Rights of data subjects

4. If you ask the subject’s consent for processing personal data (opt-in), can the person revoke his/her consent at a later time (opt-out)? If the subject refuses or revokes consent, what are the implications for that person?

Explanation: In accordance with the Dutch DPA (Art. 8, paragraph 1), unambiguous consent of the data subject is one of the possible justifications for processing personal data. Such consent must be specific, informed, and given freely.

5. What procedure exists for data subjects to ask the responsible party to inform them whether their personal data is processed? How are third parties, who may have objections to providing that information, given the opportunity to give their view on this?

Explanation: The Dutch DPA (Art. 35, paragraphs 1 and 2) provides the data subject the right to freely, and at reasonable intervals, ask the responsible party to state whether their personal data are processed. The responsible party must inform the subject within four weeks. Article 35, paragraph 3 states that third parties who may have objections to such a notice must be allowed to give their views in advance, unless this would require a disproportionate effort.

6. How will a request from a data subject for correction, addition, deletion or blocking of personal data be handled?

Explanation: The Dutch DPA (Art. 36) provides a right of correction or blocking, and also a right to object to processing in connection with special personal circumstances (Art. 40).
