Notes from a CIO: Security & compliance in today’s dynamic regulatory environment
Digitization has brought a lot of good things. It has facilitated better processes, brought greater transparency, and accelerated the exchange of information. Further, social networks enable us not only to connect with other people and advertise but also to find a job.
However, people put their lives on social media platforms, and companies use all that data, which raises the question, “Who is responsible?” Is it the individual who shares the data, the company that uses it, or perhaps the government, because of a lack of regulation?
According to Dragoljub Nesic, the CIO of Freja eID, a significant amount of responsibility lies with the individual, but companies and the government also bear part of it. “Unless data privacy is the company’s business like it is in the case of Freja, there will always exist a tension between profitability and everything else, including temptations to use and to abuse end-user data,” he says, adding that, at least from the cultural perspective of the Western world, the concept of a national government is ill-matched to the global nature of the internet.
In this episode of Sovereign DBaaS Decoded, Dragoljub explains how much control an individual has over their data, what happens to it, and how Freja can protect data privacy. Dragoljub and our host Vinay Joosery discuss individual data sovereignty, how our data is used, who is responsible for the data, and how the regulatory landscape has evolved to address data sovereignty concerns.
Key Insight Summaries
The Illusion of Privacy and Data Over-Sharing: We often release significantly more data than we realize through “passive” interactions, such as how long we dwell on a social media post or the background details in a photo of a pet. Machine learning now allows services to discern specific details from images that users never intended to share. Furthermore, even “incognito” browsing is no longer a guarantee of anonymity, as researchers have found ways to deduce identities even in those setups.
The Conflict Between Profitability and Privacy: For most corporations, data privacy is not the core business; survival and growth are the primary goals. This creates a natural tension where companies are tempted to use or abuse user data to stay competitive. Neša argues that proactive privacy measures are usually only found in companies, like Freja, where “trust” is the actual product being sold.
The Inadequacy of Consent (Privacy Policy Fatigue): While GDPR requires users to be informed, the sheer volume and complexity of privacy policies have led to “fatigue”. Most users click “accept” out of necessity or stress—such as when trying to pay for parking while late for a meeting—without understanding the long-term implications of the data they are handing over.
Episode Highlights
[02:05] The Concept of High-Level Digital Identity: Neša explains Freja’s certification at “Level of Assurance 3” in Sweden and its status as the only Swedish digital identity notified for use across other EU countries under eIDAS.
[05:10] Breaking the Bank Monopoly: The founders of Freja launched the service to provide a transparent alternative to the Swedish electronic identity market, which was previously a non-transparent monopoly operated by banks.
[09:05] Data Generation vs. Human Connection: A discussion on how platforms like Facebook started as tools for connection but evolved into “massive data generation machines” that track every click and scroll.
[14:38] The “Margot Wallström Rule”: A regulatory requirement in the Swedish trust framework ensuring that no single person, regardless of their role, can circumvent security to issue a digital identity on someone else’s behalf.
[20:30] The “Internet Never Forgets” Warning: Neša highlights the lack of an “original negative” on the internet; once a compromising photo or data point is uploaded, it exists in innumerable copies forever, a fact often misunderstood by younger users.
[33:53] Banking Law vs. GDPR: An explanation of how the Swedish Banking Law can “trump” GDPR, allowing law enforcement to ask broader, sweeping questions of banks than they can of non-financial institutions like Freja.
[37:30] Protecting Public Servants via Organization IDs: A deep dive into why social workers use “Organization IDs” to sign sensitive documents (like foster care placements) to prevent their personal “civic registration numbers” and home addresses from being leaked to the public.
Here’s the full transcript:
Vinay Joosery: Welcome to our fourth episode of Sovereign DBaaS Decoded. I’m Vinay Joosery, CEO of Severalnines, and this episode is brought to you by Severalnines. At Severalnines, we bring database automation software to enterprises. So, today’s podcast is about individual data sovereignty in today’s digital environment. And to tackle that, we have a CIO joining us, Dragoljub Nesic, aka Neša, CIO of Freja eID. Thanks for joining us, Neša. How are you doing today?
Dragoljub Nesic: Well pronounced, Vinay. Well pronounced. You’re one of the few that can do this. Great, actually. I’m on the tail end of a disastrous man flu. So, as of yesterday, I wasn’t sure I’d be able to do this, but fortunately, it kind of tailed off. So, if you find me sneezing occasionally during this conversation, then you know what’s the cause.
Vinay Joosery: All right, all right. Well, Neša, our audience is quite international. Freja eID is a Swedish company. Can you tell us a bit about Freja eID and your role there?
Dragoljub Nesic: Sure. Freja is, in essence, a digital identity and an associated ecosystem of services that support its use. From a Swedish perspective, it’s been certified to what is called Level of Assurance 3 by the Swedish Agency for Digital Government, and it has also been notified at what is called eIDAS level substantial as the only Swedish digital identity for use in other EU countries.
Now, if I translate this for a regular user, what it means is we support authentication and signatures in both the digital and physical worlds, both for private use and as part of an employment. So, if I, for example, need to represent myself to a business, a local authority, or a government agency, I can use the private part and share my name, surname, and, in Sweden quite commonly, the civic registration number, or personnummer as it’s called here, with the service that I’m logging into.
However, if I want to represent or access the same services representing my business, I can choose to share my employment ID rather than personal data. And if we have time, we can touch upon that later on. And I guess this sharing of information is pertinent to the discussion we have today. The concept of an individual’s control over data has been a cornerstone of the service that we’ve set up for our users since we launched it five years ago. In the company manifesto, we have vowed not to share the data of our users without explicit consent, each and every time, period.
So, what do I do at Freja? I’m, as you said, the CIO. I’m also one of the founders. And as the CIO, I’m responsible for development, operations, and compliance of the Freja platform. So, a lot of my time is actually spent on questions of data privacy.
Vinay Joosery: Okay, thanks, thanks. So, today’s topic: individual data sovereignty, how our data is used, who is responsible for the data, how the regulatory landscape has evolved to address data sovereignty concerns. So, what we see is, it’s quite interesting where, you know, as part of your company manifesto, you actually decided not to just hand over everything, but you decided to side with the user, in a way. May I ask you why that is?
Dragoljub Nesic: Well, we are in the business of trust, so we need our users to trust us. We launched Freja because we saw there was an awkward situation in the market in Sweden, effectively a monopoly operated by the banks, and that is a very non-transparent setup. The user has very little control over what’s going on in terms of sharing their data. And we decided two things. One was to try to break the monopoly and introduce a competitive environment for electronic identities in Sweden, then in the neighboring countries, and essentially in Europe, but also to, as you say, be on the side of the users and let the end user decide what data is shared and where. I mean, it’s perfectly okay for a service to ask for some personal data, but you as a user should have control over when and if that is shared, or whether you’re willing to release that data, rather than the service deciding that on your behalf.
Vinay Joosery: Yeah, yeah. And that sort of brings us to the situation today. I mean, we talk about the digitalization of society; more data is created in multiple places. You know, we have Facebook, we have TikTok, Instagram, and so on, and this brings questions on the ownership of individual data. This is a brave new world. You know, as private individuals, we’re not able to trace what happens to our data, or how it’s shared, or even sold. So, how did we get here?
Dragoljub Nesic: Oh, well, as with most things in human history, I don’t think anyone really understood either the benefits or the pitfalls of digitalization at the onset, say, 30 years ago or so, when it really accelerated. Some benefits were very obvious, and I frequently take the Swedish tax office as an example. The process of paper tax returns was so tedious, so error-prone, and so slow that the tax office very early became an adopter of digitalization. And the results are fantastic.
Automatically filling in, say, Field A21 on form X24 based on the calculation result presented in Field S12 of form P17. I mean, it’s a huge leap forward in getting tax returns done correctly, quickly, and effectively, because you can fix all those things online in real time rather than sending paper copies back and forth, paying for postage, and taking eons of time. So for services like that, the benefits of digitalization were very clear at the onset as a way of providing a better service to the citizens at a fraction of the cost it previously took to process those returns. If you take a look at some other services, like Facebook, I don’t think the guys in the dorm that invented it really grasped what was to come, regardless of how this is romanticized in Hollywood movies.
I mean, in terms of data, I don’t think most people, even to this day, understand the amount of data that they share directly or indirectly, voluntarily or involuntarily. Sort of, you like an image of a dog on Facebook, well, we can conclude that you like dogs, right? You published a photo of a pet dog, you really like dogs. You published a photo of the same pet dog again, oh, you are a dog owner. And that is a data point about you, which you just released to Facebook, and by virtue of the fine print to anybody they deem should get hold of that data. And if we just hold on to this image area for a second, the field of machine learning has made unbelievable strides in the past 20 years or so. So now, if you publish this image of a pet dog, one can not only say that it’s a dog in the image, the service can discern the exact breed of the dog and the elements of the background and so on. So you released even more data without being aware of it.
And then there are also things that one doesn’t do. Like, you scroll through a feed of something on a social network and you don’t spend much time on a post or don’t click on something, and that is also a data point that the service collects from your behavior. So, what started as a way of connecting people to other people turned into a massive data generation machine, in essence. And, you know, we can look at a similar service, say, search engines. Finding information amidst all the data out there on the internet is a very difficult task and a very productive task for humans. Many years ago, you could look up something in an encyclopedia in total privacy. The process was slow, but the answer you got was reasonably accurate, although, depending on how old the encyclopedia was, potentially outdated. But the search itself was very private. It was very difficult for someone else to figure out what you were searching for in this encyclopedic lookup.
Nowadays, with the onset of search engines, the results come instantly. The drawback is that they contain much more noise. It’s not as easy to discern the quality of sources on the internet. And moreover, in terms of this data leakage that I started talking about, every time you search for something, you reveal something about yourself. It kind of reminds me of those spy games from the 70s, where the Soviets and the US were discussing nuclear weapons and so on. The questions you ask, but also the questions that you don’t ask and how you ask them, reveal something about what you know and don’t know.
And the same applies here. And there are various ways that people try to get around these privacy issues. You know, people started using incognito browsers to try to minimize the data leakage and increase privacy. And what’s the bad news here? Well, just a couple of days ago, I read an article where researchers have found a way to deduce who a user is, even in an incognito setup. Something that is of huge concern for individuals who operate under oppressive regimes and in hostile environments. So, to answer your question, actually, after a very, very long introduction, we ended up here by sheer progress. You know, like it or not, the world is not going back. And, combined with the fact that we humans often do things without fully understanding the consequences, we are bound to see, you know, magnificent developments, but also setbacks, in terms of data privacy, looking forward.
Vinay Joosery: Yeah, and I guess that’s where, typically, regulation comes in to actually, you know, sort of regulate how the data is used and things like that. So, digital identities are quite a new thing, right? They’ve been around for a couple of years, and now they’re being actively used in society. So, can you tell us, like, you know, Freja manages digital identity and all the data around it, right, on behalf of individuals and companies. Can you tell us something about the regulations that the data falls under?
Dragoljub Nesic: There are quite a few, but if I think of it, we could distill it down to the three most important. As I mentioned in the intro, Freja is certified by the Swedish Agency for Digital Government, also known as DIGG, and it’s been awarded what is called an approved Swedish e-identity for both private and employment use at Level of Assurance 3, and also notified for European use at the eIDAS substantial level. The award of this stamp of approval was done through an audit against the trust framework, which governs the issuance of identities in Sweden and under eIDAS. In its essence, the trust framework requires issuers to base their operations on a risk-based information security management system, which is audited annually by independent auditors, and a few other things. We are obliged to assess the risks in everything we do, from operational risks, financial risks, data management risks, and employment risks, to environmental risks and whatever else, and then drive the information security work around that.
So, if I take one example: there’s a stipulation in the trust framework that no single person should be able to circumvent the security of the setup and issue an electronic identity on someone else’s behalf. For whatever reason, this is called the Margot Wallström rule. I’m not sure why, but probably because she was the Minister of Foreign Affairs when the trust framework was written. So, we have ample controls in place to make sure that no single employee can, you know, by error or by malicious intent, issue an identity on behalf of anyone. That’s the, kind of, first regulatory area. The second, obviously, is GDPR. We are operating in that environment.
And I think we were quite lucky there in that, when we decided to build and launch this electronic identity, GDPR was already decided. So we had the luxury of designing the system from the ground up with GDPR in mind, and we took great care to make sure we handled the users’ data appropriately, with relevant information, Terms and Conditions, privacy policies, and risk analysis about how this is handled. And then, also, having adequate clauses in contracts with both suppliers and customers that govern the transfer of any personal data.
And moreover, I’m very proud of the culture that we’ve built within the company. You know, every time a new idea pops up that we want to develop, or that somebody asks to be developed, one of the very first questions addressed is whether it has any impact on data privacy: what private data, if any, is collected or stored, how it will be protected and processed, and how we inform the end users. So it is strongly interwoven into the DNA of each and every employee at the company, from programmer to designer, to product owner and salesperson, to think about the data privacy perspective from all the possible and relevant angles.
Vinay Joosery: This is very uncommon because, usually, you know, whenever you have innovation, whenever you have new ideas, it’s: ‘How is it going to make more money?’ I mean, usually that’s kind of the default reflex, I would say. You know, anything you do has to be motivated by money, payback and things like that, usually.
Dragoljub Nesic: Well, I won’t say we don’t take that into account, but it’s definitely also a question that we ask ourselves: is it worth putting effort into this? But identities and data are our core business. We cannot afford to do something foolish in the area of data privacy. So we take really great care of that. And I think… the final part of the regulatory environment is that Freja is a listed company on NASDAQ in Stockholm, at the OMX, so we need to comply with NASDAQ’s code of conduct for listed companies, with IFRS accounting, and with a set of rules laid down by the Swedish Finansinspektionen, or Financial Inspection Agency, governing things like insider trading and information release.
And similarly to GDPR, we have actually instilled a culture where any initiative, change, or communication, a press release, whatever it is, takes into account the need to create what is called a logbook, so we can trace things like insider trading, or whether the market needs to be informed, and if so, how. So, we actually have a change control process before we deploy any functionality, where one of the checks is, you know, can this affect the share price in any way? And if it can, have we actually informed the market through the correct channels before releasing the functionality, and so on?
Vinay Joosery: Okay, so that’s a great approach. And then, if we look a bit at what’s happening in terms of data and individual data… I mean, people are putting their lives on Facebook, right, or on all these different social media platforms. And who is responsible, in a way, for that? I mean, because at the end of the day, companies are taking advantage of all that data that’s being put out there, but who is responsible? Is it the individual? Because the company didn’t force you to put the data there, right? So is it the corporation’s responsibility, or is it the government’s, maybe, because of a lack of regulation?
Dragoljub Nesic: I wish there was a simple answer to these questions, but I don’t think there is, if there’s any answer at all. I guess my take is that a large responsibility lies with the individual. The simple fact that the internet never forgets is not appreciated or understood enough, I’m afraid, particularly amongst the young. And I have no intention of blaming the youngsters, not at all. I mean, being young, one should be allowed to act foolish, to take bigger risks. That is just a part of growing up.
The problem is that many years ago, when I was a foolish youngster, most things would quickly be forgotten. Foolish things committed on the internet today never are. They stay there forever. So, you know, how many movies have you seen where somebody is blackmailed with photos and is only at peace when the original negative is retrieved? On the internet, there is no such thing as the original negative, just innumerable copies. So if you don’t want a compromising photo haunting you for the rest of your life, don’t upload it anywhere. And I’d actually go as far as saying, don’t take it at all. But I guess that’s too much of an ask.
In terms of companies, you know, one describes various stakeholders: the owners, management, employees, customers, partners, the environment, the public, and so on. However, at the end of the day, as you pointed out earlier, very few companies can exist while perpetually making losses. Surviving and growing is therefore, by necessity, the primary goal of any company. So unless data privacy is the company’s business, as it is in the case of Freja, however cynical it may sound, I argue there will always exist a tension between profitability and everything else, including temptations to use and to abuse end-user data.
And then, you know, in terms of the regulator, the governments of the world, I guess the concept of a national government is a very ill match for the global nature of the internet, at least from the cultural perspective of the Western world. I mean, some countries have very strong controls of what is done on their networks, by whom and how. So there, the government can and does have much stronger say on how data is handled, with obvious downsides, of course, as most of these environments can hardly be described as democracies as I would define them. So, as long as democracy and openness are fundamental societal values, the government’s influence is limited and, moreover, fragmented.
If you take a simple example, there is this app that kind of ages people’s photos on the net. What possible governance can the Swedish Agency for Data Protection exercise over that service if it’s offered from somewhere outside the EU? Literally none. And as a democracy, I don’t think we are ready to bar such services from the Swedish networks. The possibility of influencing that or checking how those photographs, which are uploaded for fun, are handled is very, very limited. Having said that, on the positive side of regulation, at least for businesses which operate with legal entities inside the EU, GDPR has brought data privacy to the agenda of pretty much every company that operates in any of the European territories.
Vinay Joosery: Yeah, yeah, and in that case, if the company did business or had some kind of entity in the EU, then there would be repercussions.
Dragoljub Nesic: Exactly, exactly. But if you have a company somewhere outside the EU, or a web server offering a service that Swedish users like without really thinking about the consequences of uploading data, there would be very limited tools to enforce any kind of scrutiny on that service, or to check, you know, whether the data is handled correctly or not abused in any way.
Vinay Joosery: And I guess, when we come to this, you know, personal responsibility, I mean, you can only have responsibility if you have power, right, and if you’re the owner of the data. So, in the case of GDPR, you do have rights, right? You have the right to be forgotten. You have the right to know what the company knows about you, you know. And this brings us to individual consent, right? In a way, the individual has given consent, right, to the company to make use of their data. So in a way, what’s the problem? I mean, you have these terms of service that you have to accept when you create an account, and user consent. In a way, is that alone enough, right, when it comes to handing over the power to the corporation?
Dragoljub Nesic: So let me ask you a quick question back. When was the last time you actually read a Privacy Policy for a service?
Vinay Joosery: That was a long time ago.
Dragoljub Nesic: That’s exactly my point. Most privacy policies that one gives consent to are extremely dry and incomprehensible. Yes, GDPR stipulates that you need to inform the user in a way that they will understand and can grasp, and so on. And I think those privacy policies are getting better. But there are just so many of them. You know, I argue that we have Privacy Policy fatigue. Most people, guilty as charged here as well, just cannot be bothered, with notable exceptions. I personally know a guy who actually asks Google to erase everything they have on him every month. Most people simply click on the Accept button and go on.
You know, just imagine the following situation. You’re late for a meeting and you need to pay for parking. In Sweden, nowadays, most parking is paid for with apps. Unfortunately, there’s not one app to rule them all. And on several occasions in the past year or so, I found myself in a situation where I needed to download a new app to pay for one particular parking spot. So, you’re late for a meeting, you need to pay for parking, and you find out you don’t have the app. So, you download the app. What does the app do? It asks you for your email. Ah, yes, enter email. But you need to click on the link to confirm the email. Fine, do that as well. Time is ticking, you’re stressed.
Then you enter a mobile phone number, and then you need to confirm it through an SMS. And then you enter the credit card details. And then comes the Privacy Policy. I mean, by the time the Privacy Policy is presented to you, whenever that is, you’re even later for the meeting, and you’ve lost the will to live because this registration process is so tedious. So you’ll accept anything just to pay the bloody parking and get on your way. And in this way, the user will accept things, and very few of us will, after everything has settled down, go back and check: what did I actually accept earlier on? You’ll just find it’s just another parking app, and so on.
And by the way, this onboarding and check-in thing is something that we, with Freja, do really, really well. You know, one button: we tell the user, by the way, this service wants your email, mobile phone number, and whatever other details you’ve previously registered with us. Do you agree to share them? So, at least the stress of entering and confirming all this in a garage with bad 4G or 5G coverage is much smaller, and the process is faster. So, what I actually want to say is that most services require users to give consent to some form of Privacy Policy, but also that most users don’t actually bother to read them. And even those that do very frequently have difficulties grasping the implications of what the policy actually says.
Vinay Joosery: Yeah, basically you need a lawyer pretty much in a way.
Dragoljub Nesic: It’s written by a lawyer, so it’s probably legally correct. And you probably need your own lawyer to understand it. And yes, there are notable exceptions. We have users now who contact us asking questions about our policy, and it’s a welcome scrutiny. And we do have a policy which really, in simple terms, tries to explain what we store, why we store it, how and under what circumstances we share it, and how you can be forgotten, and everything else. But it still is like two pages of fine print, and we’re just one of the services that one uses. And if you add all the parking apps and all the other T’s and C’s, pretty much every app you have on your phone is likely to have presented a Privacy Policy and Terms and Conditions, and then most websites and so on. It fulfills the legal requirements, but it’s causing enormous fatigue and difficulty of understanding.
So personally, I have a very cynical approach to that, in that I assume that the data I share can and probably will be abused in some odd way. So, take email as an example. I use one mailbox for practically all shopping accounts, memberships, and so on. I don’t trust the provider. I know they can, and probably do, scan everything I do, everything I order. And based on that, they have a fairly good idea that I’m around 50-ish, that I like music, bicycles, and triathlons, and things like that. But, you know, I’m not so concerned about it.
And, you know, in a perfect world, I would really love my purchase patterns to remain my own thing. But since I’m not willing to use cash and go around physical stores to buy stuff, I accept the consequences. In my kind of security world, I’ve got bigger fish to fry. And by that, I mean that when it comes to personal email, the emails I send to friends and family, and the valuable stuff, I don’t use that provider. I use a different one, one that I trust more with respect to the privacy of my data. And those are the emails I don’t want somebody’s prying eyes on. Again, it’s a personal choice. I trust one provider over another. I might be wrong, but hopefully I’m not. And it’s something each and every one of us should think about: how you handle the data you hand over to the services you use. You know, there are just so many things that keeping everything under control is beyond my capability.
Vinay Joosery: And in a way, it’s not a great testimony to the world, the environment we live in, that we know we cannot trust the providers, in a way. And that’s sad. And even governments, right? Can you trust your own government? I mean, governments regulate, but they are not only regulators, right? They are also consumers of the data. I mean, law enforcement, right? They tap into people’s activities or whatever, right? And is there a conflict of interest there, right? I mean, how do you, as a corporation, mitigate the risk from increasingly assertive enforcement?
Dragoljub Nesic: Yeah. I mean, apart from, you know, the law enforcement consumers, as you describe them, it’s worth pointing out that governments are also huge consumers of personal data in their day-to-day operations: the tax office, healthcare, all the local services they provide. But nevertheless, it’s a very interesting question. To what extent the conflict exists really depends on the level of democratic maturity and the core values in a society.
You know, honestly… And I’ll take this as an example, for good and for bad, and maybe somebody might not agree with me, but I don’t know how Chinese companies mitigate these risks, or even Chinese citizens. There is this legislation that requires Chinese citizens, and here comes a quote, ‘to provide needed support, assistance, and cooperation,’ end quote, to state intelligence organs. That is a very uncomfortable position to be in as an individual. In essence, it says, you know, whether you’re living in China or living abroad, you are obliged to assist the intelligence agencies if requested to do so, in accordance with the other laws of the country, and so on.
But that’s a very, very tricky position to be in as a private citizen, at least from the view of the Western world. It probably is much less questionable in China, but from my perspective, it is very, very sensitive. If we look at the law enforcement side, being an issuer of electronic identities, we frequently get questions from law enforcement about the activities of end users. And I think they frequently get surprised by the level of specificity we ask of them in their queries before we release anything.
And I think the reason they get surprised is that banks in Sweden operate under the Swedish banking law, which, if I’m not mistaken, in chapter 11, obliges the banks to collaborate with law enforcement agencies and share all available data upon request. And this law trumps GDPR. So, as a user, you cannot complain about this, because the banking law is the legal basis for sharing your data with law enforcement agencies without your consent. You know, there’s an e-ID scheme operated by the banks in Sweden. It falls under the same law. And therefore, the law enforcement agencies are allowed to ask sweeping questions about the use of such identities in case there is an ongoing investigation.
Freja is not a financial institution, so we have a much higher threshold for releasing information. When a law enforcement agency comes to us, we don’t allow questions like, “Give us a log of activity for this specific user for a month.” We actually ask them to say, “We suspect that person X has logged in to service Y at this date and time. Can you confirm that or not?” And I think the law enforcement agencies, most frequently the police, understand our position. It also shows a good example of how various laws can interact in delivering some of the information to law enforcement agencies. If I summarize, on one side, there is a legitimate need for crime prevention, prosecution, and intelligence agencies to do their work effectively.
On the other hand, there is the need to protect the privacy and integrity of citizens. And unfortunately, this latter right is very frequently abused by criminals. So there is a constant tug of war between these interests. And laws are very rarely watertight, which makes the independence of the judicial system, and its separation from the executive powers, the pillar of stability. And obviously, if you look at the world over the past couple of years, the public has to trust and believe that this is the case: that law enforcement and the executive powers are separated from the judicial system, and that the judicial system provides an independent interpretation of the laws. If that is the case, I think things work out well in the end. It’s in environments where there is overlap, or no true independence between the executive, legislative, and judicial powers, that things can go horribly wrong, in terms of the relationship between a government and its citizens, and the handling of data.
Vinay Joosery: You also mentioned earlier, you know, when it comes to… and maybe this applies more specifically to these digital IDs, right? There is a separation between the private ID and the employment ID, your ID as an employee. Do you want to elaborate on that now?
Dragoljub Nesic: Sure. At Freja, we believe, and are very sure, that there is an important distinction between one’s personal and employment identities. And I’ll take a very current example of a social worker, say, employed by Stockholm City. And Stockholm City is very far ahead in terms of digitalization. This person is about to digitally sign a decision to place children from a family into foster care, due to domestic abuse, to protect the children. Now, the rules require this signature to be made with one of these Level of Assurance 3 electronic identities, which I talked about earlier. And up until recently, the only way to do so was using your private ID issued by the banks.
Now, this has one huge problem, which is that if you use it to sign, such a signature contains your name, surname, and civic registration number. So now, ponder the disinformation campaign raging on the internet right now, where Swedish authorities have been accused of kidnapping Muslim children and placing them in foster care with gay couples. You know, in such an atmosphere, with the amount of misunderstanding and hatred that such campaigns stir up, does it sound reasonable that the signature on the documents placing these kids into foster care contains the name, surname, and registration number of the signer? In my opinion, it does not. From the civic registration number, you can very quickly jump to an address, where that person lives, their relatives, and everything else. You know, the signer is acting in the capacity of a social worker, rather than as a private person. So, leaking their personal information in such a crucial employment decision is not okay.
So, with that in mind, a couple of years ago, we extended the capability of the Freja platform so you can have an organization ID, or multiple, since one can have several employment relationships, and associate it with your personal ID. And by doing so, it allows things like signing such a decision as social worker number 23. So you’re signing in the capacity of a social worker employed by Stockholm City at the time of the signature, rather than as a person with a name, surname, and civic registration number. So at least the civic registration number is out of the equation, obviously, for the protection of everybody. Having your name and surname in there might make sense, but at least there is a choice about what goes into that signature. And the civic registration number certainly shouldn’t be in it, because it’s too much of a data leak, which can lead to unintended consequences.
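As a side note for technically minded readers, the distinction Dragoljub describes can be sketched as a choice between two attribute sets embedded in a signed document. This is purely illustrative: the function and field names below are hypothetical and do not reflect Freja’s actual platform or API.

```python
# Hypothetical sketch of what signer attributes end up inside a
# digital signature, depending on which identity is used.
# These field names are invented for illustration only.

def personal_signature_attrs(name, surname, civic_number):
    # Signing with a private e-ID embeds full personal details,
    # including the civic registration number.
    return {
        "name": name,
        "surname": surname,
        "civic_registration_number": civic_number,
    }

def organization_signature_attrs(role, employer):
    # Signing with an organization ID binds the signature to a role
    # held at the time of signing, keeping the civic registration
    # number out of the document entirely.
    return {"role": role, "employer": employer}

private = personal_signature_attrs("Anna", "Svensson", "19800101-1234")
org = organization_signature_attrs("Social worker no. 23", "Stockholm City")

# The organization-ID signature reveals nothing that leads back to
# the signer's home address or relatives.
assert "civic_registration_number" not in org
```

The point of the sketch is simply that the sensitive identifier never enters the signed artifact when the role-based attribute set is chosen.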
Vinay Joosery: Yeah, yeah, I understand, yeah. It’s about, in a way, protecting some people who have to do their jobs.
Dragoljub Nesic: Exactly.
Vinay Joosery: So, you know, looking ahead, now we’re talking a bit more about the regulatory landscape. I mean, it’s great to see that Freja is adopting all these principles to preserve privacy, right? And it’s not just the EU that has the GDPR. I mean, there’s the CCPA in California, and I think it’s getting updated to become the CPRA from January next year. So we see laws that are more consumer oriented rather than government oriented, so to speak. Do you think corporations will implement more proactive measures, just like Freja did, to protect the privacy of individuals’ data, or will they focus solely on compliance, which basically means stopping at what’s required of them?
Dragoljub Nesic: Whoa, that’s a tough question. I guess most companies will, by their nature, try to focus on their core business and do only what is required. Because remember, these questions of data privacy are extremely important, but there is also what you could call regulatory fatigue, as well as, you know, social fatigue, because there are other demands too. You cannot forget about sustainability. You cannot forget about equality, or social responsibility, all of which place demands on the resources a company has. And that same company is operating in a very competitive world, with a playing field that need not be equal for all players.
Just take the previous example of a company operating in Sweden versus somebody operating elsewhere with fewer controls, and so on. So I guess that some companies will go beyond what is required. And I’d venture to say that it’s primarily those that fall into the category where data privacy is their core business, because there it comes naturally to do the best you can, and everything you can, to fulfill data privacy needs.
And I guess Freja falls into that category. We, as founders, are strong advocates of data privacy. That’s why we launched this service. And we built that way of thinking into everything we do. But another company that does something completely different as its core business, I don’t know, makes cars or sells food, they have many other demands to take care of. And it would only be reasonable to expect that they try to stay afloat by doing what they have to, perhaps not stretching themselves too far in any particular direction, in order to survive and grow. That would be my take on it.
Vinay Joosery: Yeah. Well, it makes sense. Companies need to compete, need to survive, so they will do whatever it takes to survive. And then there’s also… I mean, if you have investors behind you, then it’s growth, growth, growth at all costs, in a way. That was a great rundown of, I would say, the challenges companies face when it comes to how they treat their data. Because on one side, they need to be trusted by people; they’re taking care of other people’s private data. And on the other side, there’s great competition out there, and you’ve got to do everything to make sure that you can compete and be the best. And then, if you have investors and things like that, there’s a huge force to actually grow even faster and do more. So, yeah, I guess we’ll see what the future holds. So, Neša, you did an Ironman a few months ago. Why don’t you tell us a bit about that?
Dragoljub Nesic: Well, one is none, two is one. Actually, this was my second Ironman. And truth be told, I did the first one in 2019. And believe it or not, the day after, I was feeling so well that I registered for another one, in 2020. And, you know, it’s probably the adrenaline of the race or whatever it is, because this year, when I started preparing, and the race was postponed twice, I started thinking, why did I do this? But it’s always like that. You know, the mountain looks very steep, but once you climb it, it’s great.
So, somehow I suspected you were going to ask me this, so I actually brought one of these to show the viewers. This is what you get if you compete in an Ironman. And it’s really heavy. I don’t know whether you can hear this sound. I’d venture that pretty much anybody without severe health issues can compete in and complete an Ironman race. It’s not as daunting as it sounds. Winning your age group is difficult; that is a tough nut. Completing the race may sound very difficult, but I think the time limit for most of them is like 16-17 hours. I’ve talked to many athletes, both younger and older than me, and it’s a fantastic experience.
I think Kalmar, in particular, has one of the very best running courses, among the highest rated of all running courses in the Ironman series of races. And it’s incredible. It’s three loops through the city center and through some suburbs, where people actually build stalls and, you know, cheer throughout the race, which lasts something like 17 hours. It’s unbelievable. So, as an experience, I would recommend it to anyone. You know, for me personally, it helps me not to work. Participating in an Ironman race and preparing for it allows me to do something else. Otherwise, I’m afraid I’d be like a goldfish. You know, you feed them and they die. You give me work, I’ll just kill myself with it. So, it’s good for work-life balance. And by the way, my kids have moved out, so I have the luxury of spending the time. And my wife has a hobby. That’s also very important. So that’s how we live.
Vinay Joosery: All right. Okay, great, Neša. That’s great to hear. Well, on these words, again, thank you so much for joining us.
Dragoljub Nesic: Thank you for inviting me. It was a pleasure spending some time with you.
Vinay Joosery: Yeah. Okay, then. Well, thank you, folks. That’s it for today. And until next time, bye for now.