How To Do An InfoSec Con Right: ISSA-LA Summit Does It Again [Part 1]


By Francesco Cipollone

Location, location, location plus amazing content are reason enough to explain why APPSEC-CALI and ISSA-LA are rapidly becoming THE security conferences in SoCal and the places to be.

But if that’s not enough to make sure it’s on your to-do list for 2020, let me break down the ISSA-LA Summit into specific sessions to tell you more.


The content of the conference has always been good, packed full of excellent talks, but this year there was a range of new features.

First of all, as a supporter of women in cybersecurity, and having been mentored by the great and amazing Jane Frankland, I appreciated the diversity theme and Summit XI’s support of women in cybersecurity.


For me, the most interesting part of these conferences is still the technical talks, but then I am biased. However, I particularly enjoyed Sean’s law enforcement presentation about fraud cases. It was definitely something different for a security conference, and the identity fraud cases were eye-opening.

Identity theft (even when not completely digital) and the techniques used were a great example of how traditional crime and cybercrime are now closely linked, which I will say more about later on.

But, in true British fashion, I can’t really go any further without mentioning the weather.

The conference, which was held right on the beachfront, got underway while it was raining!! Now, I didn’t go to LA for rain; we get enough of that at home…


Thankfully though, the mood was not dampened, and Richard Greenberg kicked off the conference with his usual cheerful, energetic vibe, reminding everyone that security is everyone’s job and encouraging everyone to help the OWASP chapter grow.

He shared a story about when he started in infosec without much background in application security or programming, which really resonated with me, because I would say participating in meetups and conferences like this has given me so much more than standard dry training.

Conferences and meet-ups like OWASP’s provide real insight into the industry, as well as best practices and what has and hasn’t worked; all from industry leaders.

That sounds like a free lunch, right? But don’t be fooled: you still need to do the legwork, studying and testing, in order to achieve your best; nonetheless, going to conferences definitely helps prepare you for what’s in the market and where you should focus.



Wendy Nather: YOLO, Solving authentication in the age of Cloud

After Richard’s welcoming opening speech, Wendy Nather rocked the show with probably the best and simplest explanation of identity and permission I’ve ever seen. The talk built on the topic of identity to move into the concepts of the Zero Trust network.

The last time I discussed identity with Paul Simmonds, from the original Jericho Forum, it was not an easy message to get across.

Wendy did a brilliant job of describing identity and permissions in simple terms, focusing on the distinct concepts of identity, permissions and roles/rights that are the pillars of well-defined authentication.

She also did a wonderful job annoying the hell out of any kid in the world (including her own) by using “cool” words like YOLO in a completely different way: You Only Login Once, an evolution of single sign-on.

Note: For older folks like me (or not so old) YOLO usually translates as You Only Live Once—an encouragement to live life.

The topic is really close to my heart because, as I said, I have spent a lot of time discussing it with Paul, and being frictionless was a key delivery factor, especially for adoption and secure use. If it is simple, people are less likely to try to bypass the security measures put in place.


To summarise the identity trinity:

  • Identity is fundamentally who you are and your traits; nonetheless, being Joe does not prove who you are, nor whether you have access to your home.

  • Rights are the containers of what you could potentially do, but rights alone do not give you access; if you are Joe and have the right to go into Joe’s home, that does not mean you can enter without Joe’s keys (the permissions).

  • Permissions are the final part of the three-step process to get access somewhere. Joe, with his identity card (identity) and the right to access his home (certificate of ownership), has permission to access his home by using his keys (permissions).
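The trinity above can be sketched as a simple access check. This is a minimal illustration of the concept, not Wendy’s material; the names (`Principal`, `can_access`) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Principal:
    """Who you are: identity is a claim about traits, not access."""
    name: str
    rights: set = field(default_factory=set)        # what you could potentially do
    permissions: set = field(default_factory=set)   # the "keys" that actually grant access

def can_access(principal: Principal, resource: str) -> bool:
    # Access requires BOTH the right (ownership) and the permission (the key):
    # identity on its own proves nothing about access.
    return resource in principal.rights and resource in principal.permissions

joe = Principal("Joe", rights={"joes-home"}, permissions={"joes-home"})
stranger = Principal("Joe?", rights={"joes-home"})  # claims the right, has no key
```

Joe, holding both the right and the key, gets in; the stranger claiming Joe’s right but lacking the key does not.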


Now if someone gets hold of Joe’s keys or clones his keys, they will potentially be able to access his home.

Normally James, Joe’s nosy neighbour with too much time on his hands, would be on the lookout for anyone trying to get into Joe’s home without being Joe.

The problem arises when the intruder looks and acts like Joe.

If he looks like Joe and acts like Joe, he probably is Joe, and James, Joe’s nosy neighbour, won’t call the police to raise the alarm.

Wendy went on to explain how intruders who look and behave like legitimate users lead the controllers (police and nosy neighbours) to miss the problem entirely.

I briefly talk about how to protect digital identities from profiling and similar attack techniques for everyone in this article:


The talk moved from the pure concept of identities to a more detailed investigation into the modern challenges in this digital and distributed world.

The main challenges highlighted were:

  • Physical-based authentication and levels of trust

  • Mobile authentication

  • Ubiquitous access to devices and challenges of the cloud

Traditionally the concept of ‘trust’ and authentication went hand in hand. If your laptop was in the inner perimeter (the right side of the firewall) it was automatically trusted.


In most organisations, this risk-based approach was enough, but due to the increase in cybersecurity threats from malicious insiders (attackers or disgruntled employees) and the rise of Bring Your Own Device (BYOD) policies, the concept of the perimeter has become weaker and weaker.

For some statistics, I’ve used Figure 6 of the Verizon Data Breach Investigations Report (DBIR) of 2019.


For this reason, in 2014, Google came up with BeyondCorp, popularising Zero Trust networks. The model has since been implemented and interpreted in many different ways: Forrester’s Zero Trust, Google’s BeyondCorp, Gartner’s CARTA and many more paid-for options.


Each one was a slight twist on the same concept:

  • No device is automatically trusted

  • The device gets authenticated on a regular basis

  • Anyone is untrusted (hence no access) by default unless they can prove who they are

As I mentioned though, the concept was implemented in different ways.

The Google way:


Intel version:


The idea of location-based trust has its troubles, especially now that location is ephemeral and no longer implies a trust model.


The concept of identity can’t stay limited to a password, which is an outdated idea. As Wendy said:

Something you know... you will forget

Something you have... you will lose

This is why the model moved, through a step-by-step process (from easier to more complex), towards a Zero Trust network.

Step / Level 1 - Easiest


This model relies on reducing the attack surface with black and white lists of the devices that will be admitted. The model is simple but onerous to maintain.

Step / Level 2 - Discovery


This model relies on discovering new devices; each new device gets challenged with two-factor authentication (2FA) and is then trusted.

This enables the creation of a clean list of devices.

The next step would be to head towards a profile for the user.


Step/ Level 3 - Gain Visibility


This relies on visualising any device that is on the network. A device is then trusted if corporate owned or enrolled or well known (e.g. BYOD).

This model works best if users are linked to individual devices; binding users to devices prevents session or credential hijacking, as there is a 1:1 relationship between the user and the device.

The Device+User becomes a ‘lightweight fence’ and the real new perimeter.

Step / Level 4 is the ultimate level


This level sees the device linked to users with policies and profiling.

It gives users freedom through policy: the more systems you want to access, the more policies you are asked to satisfy.

Depending on the action within the system, the user might get additional requests (e.g. if a particularly sensitive action is attempted, the user gets challenged with a permission request).

Don’t enforce MDM and policies across the whole user base, as this could end in users bypassing or rejecting them.
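The Level 4 idea can be sketched as a per-request policy decision that combines device enrolment, user binding and action sensitivity. This is a minimal illustration under my own assumptions, not taken from any specific Zero Trust product; all names here are hypothetical:

```python
# Device -> bound user (the 1:1 "lightweight fence" from Level 3).
ENROLLED_DEVICES = {"laptop-42": "joe"}

# Actions considered sensitive enough to warrant a step-up challenge.
SENSITIVE_ACTIONS = {"export-data", "change-policy"}

def authorise(user: str, device: str, action: str, passed_2fa: bool) -> str:
    """Evaluate trust per request: no device or user is trusted by default."""
    if ENROLLED_DEVICES.get(device) != user:
        return "deny"         # unknown device, or device bound to someone else
    if action in SENSITIVE_ACTIONS and not passed_2fa:
        return "challenge"    # step-up authentication only where it matters
    return "allow"
```

Routine actions flow through with no friction, while sensitive ones trigger a challenge; this keeps the policy burden proportional, rather than enforcing it across the whole user base.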



Jim Manico rocking the show, powering through the OWASP Top 10: OWASP Top 10 Proactive Controls


Jim is a friend who not only owns the stage like a boss as a public speaker and Toastmaster, he also has a breadth of knowledge that is breathtaking :-)

A beach selfie was most definitely required especially when I found out that Jim is surprisingly more Italian than me!


Jim absolutely rocked the stage with a real top-quality technical talk, powering through the OWASP Top 10 with examples, cases and practical applications (like injection via email).


He started by covering the ASVS (Application Security Verification Standard), which is a must-have in your security arsenal.

He then explained the OWASP Top 10 which, for anyone unfamiliar with the term, is the flagship project from OWASP.


But before he tackled that, he also had a good crack at third-party libraries, and why it is so important to care about them and verify that they are sanitised.

They matter because we no longer have much time to fix vulnerabilities in libraries or frameworks; the window for fixing a vulnerability is now measured in hours…

If a vulnerability does not get fixed ASAP, someone will start fiddling with it... especially if you are an organisation of significant size (see the Equifax data breach / Apache Struts CVE-2017-5638).

And still, businesses haven’t learned the lesson.


So how do we manage third-party libraries? And do we scan code for vulnerabilities? Because everyone makes mistakes, especially when rushed.


Vulnerability scanner options:

Input/Data Validation...the many ways


Why validate data? To avoid injections, of course. Jim gave a great example where the second statement in the query always evaluates to true: regardless of what you enter, you will receive a yes... and that’s the way to dump a client table from a production database!
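The always-true trick can be shown in a few lines. This is my own minimal sketch (using Python’s standard `sqlite3` module and an in-memory database), not Jim’s exact example: concatenating user input rewrites the WHERE clause, while a parameterised query treats the same payload as plain data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (name TEXT, secret TEXT)")
conn.execute("INSERT INTO clients VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"   # classic always-true injection payload

# VULNERABLE: string concatenation lets the payload become part of the SQL,
# so the condition is always true and every row comes back.
leaked = conn.execute(
    "SELECT * FROM clients WHERE name = '" + user_input + "'").fetchall()

# SAFE: a parameterised query keeps the payload as data, never as SQL.
safe = conn.execute(
    "SELECT * FROM clients WHERE name = ?", (user_input,)).fetchall()
```

The concatenated version returns the whole table; the parameterised version returns nothing, because no client is literally named `' OR '1'='1`.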

The plague of XSS


Note: XSS stands for Cross-Site Scripting.

Encoding and input validation are ways to filter input and avoid XSS. XSS usually affects client-side data, where a script gets passed through.


Just another example where code gets injected inside a piece of HTML.

How to avoid it?

Do not allow scripting or encoding of the script in any of your input forms!

Another example:


How can you avoid this? Well, do not allow the ‘<’ character: convert it (to &lt;) into uninterpreted text with an encoder or input parser. This turns a harmful script into gibberish.
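In Python, the standard library’s `html.escape` does exactly this conversion; a minimal sketch (the page template here is my own invention, just for illustration):

```python
import html

payload = '<script>alert("xss")</script>'

# Output-encode untrusted input before placing it in HTML:
# '<' becomes '&lt;', '>' becomes '&gt;', quotes become entities,
# so the browser renders harmless text instead of executing a script.
encoded = html.escape(payload)

page = f"<p>You searched for: {encoded}</p>"
```

The resulting page contains no executable `<script>` tag, only its escaped, inert representation.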


He then went onto the various modes of input validation:

  • Known bad input (blacklist) - expensive in time and effective only on limited occasions, e.g. negative words that you don’t want to see

  • Known good input (whitelist) - effective, e.g. any word in English

  • Exact whitelist - the most effective, but quite complex to maintain, e.g. only a specific subset of words will be accepted
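The three modes above can be sketched side by side. This is a deliberately tiny illustration; the word lists and the `is_valid` helper are hypothetical:

```python
import re

BLACKLIST = {"drop", "delete"}                  # known bad input
WHITELIST_PATTERN = re.compile(r"^[A-Za-z]+$")  # known good shape: letters only
EXACT_WHITELIST = {"red", "green", "blue"}      # exact set of allowed values

def is_valid(value: str, mode: str) -> bool:
    if mode == "blacklist":
        # Weakest: only blocks what you thought to list.
        return value.lower() not in BLACKLIST
    if mode == "whitelist":
        # Stronger: accept only input matching a known-good pattern.
        return bool(WHITELIST_PATTERN.fullmatch(value))
    if mode == "exact":
        # Strongest: only a fixed set of values is accepted.
        return value in EXACT_WHITELIST
    raise ValueError(f"unknown mode: {mode}")
```

Note how the blacklist happily passes anything it has never seen, while the exact whitelist rejects everything outside its fixed set; that trade-off between coverage and maintenance is the point of the list above.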

Fixing the redirects


Why secure the redirects?

Because this avoids a lot of malicious destinations.

A site might be perfectly legitimate, but the /subpage it redirects to might be malicious. So only allow fixed URLs.

Better still, avoid redirects altogether and validate the URL in code.
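A fixed-URL redirect check might look like the sketch below. The allowlist paths and the `safe_redirect` helper are hypothetical; the idea is simply to reject anything not on a fixed list, including absolute URLs pointing at external hosts:

```python
from urllib.parse import urlparse

# Only these local destinations may ever be redirect targets.
ALLOWED_REDIRECTS = {"/home", "/account", "/help"}

def safe_redirect(target: str, default: str = "/home") -> str:
    parsed = urlparse(target)
    # Reject absolute URLs (a scheme or host means an external site)
    # and any path that is not on the fixed allowlist.
    if parsed.scheme or parsed.netloc or parsed.path not in ALLOWED_REDIRECTS:
        return default
    return parsed.path
```

An attacker-supplied `https://evil.example/home` or an unknown `/subpage` both fall back to the safe default instead of being followed.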


Identity is a complicated problem to fix


Don’t try to fix it yourself; rely on NIST SP 800-63-3 (the Digital Identity Guidelines).

The recommendation from Jim is to go through the paper several times and follow the principles.

Better still, rely on external identity services: it’s easier, and they will probably do a better job than individual organisations would. At the very least, rely on established standards (e.g. OAuth 2).


Thank you, Jim, for your knowledge and your Hawaiian awesomeness!

Lunchtime calls


Lunchtime calls for some chilling in the sun and enjoying the weather (also the aforementioned beach shot with Jim).

Also, time to visit some of the vendors.



Cassio Goldschmidt: the transformation from Cloud to Containers

Cassio, the original creator of the OWASP chapter in LA, shared some thoughts on the security aspect and the transformation from the data centre into the cloud.


The traditional cyber risks remain, regardless of whether a workload runs in a container or not. IoT and other industrial attacks still affect our everyday lives.

Ransomware also still plagues us all, especially in the Cloud.


We went from physical servers and private clouds (VM) into the various cloud services (IaaS/SaaS/PaaS/FaaS).

I usually use a very good analogy to explain the cloud services.


The pizza as a service:

  • You have the do-it-yourself pizza… your traditional datacenter, where you have to build everything yourself… but you make a mess

  • (IaaS) You want less work, so you go and buy a frozen pizza (ugh)… but it doesn’t taste as good

  • (PaaS) Then you order the pizza for delivery, but it arrives cold…

  • (SaaS) Finally, you get up and decide to go to a restaurant for your pizza…
After we’ve migrated to the cloud, the next step is microservices, containers and function as a service.

The containers rely on hardened images and the microservices utilise the same set of base images. This takes away the problem of hardening every single individual virtual machine.

The last level of abstraction is Function as a Service.

Those environments provide a place to run code and don’t expose any OS.

Everything from the OS up to the application container where the app runs is the cloud service provider’s responsibility, while the code is the client’s responsibility; and one of those responsibilities is, of course, security.

These services can be costly and require a lot of trust (in your provider). Nonetheless, you won’t have to worry about the security of the OS and the applications supporting it... only about the code being executed.


In summary, there is a wide range of abstraction (from physical to logical) but it is all a matter of the trust and security level you want to put into your service provider or do it yourself.


In this serverless environment, services spin up and die very quickly, so traditional security methods don’t work. Automation is the key to winning in this world.


With an agile method, automation and automated vulnerability assessment are key in this ever-changing and fast-paced environment.


Cloud and shadow IT is a big subject, especially within an organisation. You won't be able to stop users purchasing shadow IT...unless you make good friends with the finance department and partner up. This enables you to stop the purchase of shadow IT before it happens.


Another key element as an IT Security professional is the contract review.

We are not lawyers, but we do need to review SLAs and terms and conditions, so the best thing to do is partner up with a lawyer, which maximises the use of resources and expertise.


Going back to the previous topic, anything that moves so fast can’t be secured by humans...hence the best way forward is to secure it with automation and software.


CI/CD pipelines need to be integrated with Continuous Security.

BCP and DR should be kept in mind when using cloud services, as certain components can become a bottleneck (e.g. the recent failures in Google Cloud environments).


Last but not least, picking up on Jim’s topic: don’t trust external code, and always check it, whether it comes in as libraries or not.

The external code can be an attack vector to your organisation or simply introduce faults.


IoT is the latest trend (or plague) in cybersecurity. Established manufacturers, without much software engineering background, have started inserting software modules everywhere. This inevitably leads to mistakes.

More effort is now going into fixing those kinds of problems, but we are still in the early days of regulation.

Nonetheless, we know IoT devices are a target for attacks and botnets, as well as being potentially harmful to humans (e.g. hacking the driving system of a car).





The first day was full of content, with an amazing location and top-quality speakers. And the weather turned amazing by the end.

The weather was a bit iffy in the beginning, but the location is still amazing and the logistics were impeccable. I am just wondering what happens if the conference becomes more popular... but then again I’m not, because at this size it feels nice and familiar, like a small conf should.

Next year I will be going over with my famous cloud talk, “Is the Cloud Secure? It Is If You Do It Right”, as well as my new talk, “Security Architecture: Slayers of Dragons and Defender of the Realms”.

I would definitely go back to enjoy more knowledge sharing, but somehow the theme of the conference got a little lost, and it was hard to work out whom they were trying to target; the various themes seemed a bit disconnected from each other.

Location wise Santa Monica is always a nice place for a run on the beach in the morning or evening or even both :-)


Los Angeles also offers an amazing art scene downtown, and the Walt Disney Concert Hall is spectacularly shaped.


I promised myself I would see more of the art museums, and I had the pleasure of visiting one of the newest, which has the biggest chair and inflatable dog I have ever seen… American, as in maxi size.


Part 2 is coming soon. Stay tuned.

About Francesco Cipollone

Cybersecurity Cloud Expert, Head of Security Architecture HSBC & virtual CISO Elexon


Francesco is an accomplished, motivated and versatile security professional with several years of experience in the cybersecurity landscape. He helps organisations achieve strategic security goals with a driven and pragmatic approach.

Francesco is the Director of Events for the Cloud Security Alliance, an active public speaker, and a writer on Medium. He is the founder of NSC42 Ltd, a cybersecurity consultancy in London, and previously co-founded Technet SrL.

Find Francesco on Twitter (@FrankSEC42) and LinkedIn.