Lecture 4: Access Control and Authorization


Access Control

Abu Sayed Md. Mostafizur Rahaman


Professor
Department of Computer Science and Engineering
Jahangirnagar University
Chapter 8:
Authorization
Authentication vs Authorization

• Authentication → Are you who you say you are?
  o Restrictions on who (or what) can access the system
• Authorization → Are you allowed to do that?
  o Restrictions on actions of authenticated users
• Authorization is a form of access control

3
Authentication vs Authorization

• Authentication → Are you who you say you are?
  o Restrictions on who (or what) can access the system
• Authorization → Are you allowed to do that?
  o Restrictions on actions of authenticated users
• Authorization is a form of access control
• Two fundamental concepts in the field of authorization…
  o Access Control Lists (ACLs)
  o Capabilities (C-lists)
• Both ACLs and C-lists are derived from Lampson’s access control matrix, which has a row for every subject and a column for every object.

4
Lampson’s Access Control Matrix

• Subjects (users) index the rows
• Objects (resources) index the columns

                      OS    Accounting   Accounting   Insurance   Payroll
                            program      data         data        data
Bob                   rx    rx           r            ---         ---
Alice                 rx    rx           r            rw          rw
Sam                   rwx   rwx          r            rw          rw
Accounting program    rx    rx           rw           rw          rw

5
Capabilities (or C-Lists)

• Store the access control matrix by row
• Example: the capability for Alice is her row of the matrix (shown in red on the slide)

                      OS    Accounting   Accounting   Insurance   Payroll
                            program      data         data        data
Bob                   rx    rx           r            ---         ---
Alice                 rx    rx           r            rw          rw
Sam                   rwx   rwx          r            rw          rw
Accounting program    rx    rx           rw           rw          rw

6
Are You Allowed to Do That?

• Access control matrix has all the relevant info
• But there could be hundreds of users and tens of thousands of resources
  o Then the matrix has millions of entries
• How to manage such a large matrix?
  o Note: we need to check this matrix before any user accesses any resource
• How to make this more efficient/practical?

7
Acceptable Performance for Authorization Operations

• The access control matrix is split into more manageable pieces
• Solution
  o First option: split the matrix into its columns and store each column with its corresponding object (e.g., the ACL for Payroll Data)
  o Second option: store the access control matrix by row, where each row is stored with its corresponding subject (a capability)

8
ACLs vs Capabilities

• Note that the arrows point in opposite directions…
• With ACLs, we still need to associate users to files

[Figure: Access Control Lists (left): each of file1, file2, and file3 points to the users Alice, Bob, and Fred, labeled with their rights. Capabilities (right): each of Alice, Bob, and Fred points to file1, file2, and file3, labeled with their rights.]

9
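The two storage schemes can be sketched in a few lines of code. This is a minimal Python illustration (the users, files, and rights are hypothetical examples, and rights are just sets of permission strings):

```python
# Minimal sketch: the same access control matrix stored two ways.
# The users, files, and rights below are hypothetical examples.

# ACLs: one list per object, stored "with" the object (matrix by column)
acls = {
    "file1": {"Alice": {"r"}, "Bob": {"w"}},
    "file2": {"Bob": {"r"}, "Fred": {"r", "w"}},
}

# Capabilities: one list per subject, stored "with" the subject (matrix by row)
capabilities = {
    "Alice": {"file1": {"r"}},
    "Bob": {"file1": {"w"}, "file2": {"r"}},
    "Fred": {"file2": {"r", "w"}},
}

def acl_allows(user, obj, right):
    # With ACLs, look up the object, then find the user in its list
    return right in acls.get(obj, {}).get(user, set())

def cap_allows(user, obj, right):
    # With capabilities, look up the subject, then find the object
    return right in capabilities.get(user, {}).get(obj, set())

print(acl_allows("Alice", "file1", "r"))  # True
print(cap_allows("Bob", "file2", "w"))    # False
```

Note how the lookups mirror the "arrows": an ACL check starts from the object, a capability check starts from the subject.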
Confused Deputy

• The “confused deputy” illustrates a classic security problem
• Two resources
  o Compiler and BILL file (billing info)
• Compiler can write file BILL
• Alice can invoke the compiler with a debug filename
• Alice not allowed to write to BILL

Access control matrix:

            Compiler   BILL
Alice       x          ---
Compiler    rx         rw

See next slide

10
ACLs and Confused Deputy

• Compiler is a deputy acting on behalf of Alice
• Compiler is confused
  o Alice is not allowed to write BILL
• Compiler has confused its rights with Alice’s

[Figure: Alice invokes the compiler with debug filename BILL; the compiler then writes to the BILL file.]

With ACLs, it’s difficult to avoid the confused deputy.
11
Confused Deputy

• Compiler acting for Alice is confused
• With ACLs, more difficult to prevent this
• With capabilities, easier to prevent the problem
  o Must maintain the association between authority and intended purpose
• Capabilities → easy to delegate authority

12
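The difference can be sketched in code. In this minimal illustration (objects, rights, and function names are hypothetical), an ACL-style deputy checks its own rights and can be tricked, while a capability-style deputy exercises only the capability the caller passed in:

```python
# Sketch of the confused deputy, with hypothetical objects and rights.
# The compiler holds rw on BILL; Alice holds no rights on BILL.

compiler_rights = {"BILL": {"r", "w"}}
alice_rights = {"Compiler": {"x"}}

def compile_with_ambient_authority(debug_filename):
    # ACL-style: the deputy checks its OWN rights, so Alice can trick it
    # into writing BILL by passing "BILL" as the debug filename.
    return "w" in compiler_rights.get(debug_filename, set())

def compile_with_capability(caller_rights, debug_filename):
    # Capability-style: the deputy writes using the capability the
    # caller passed in, so Alice's missing right blocks the write.
    return "w" in caller_rights.get(debug_filename, set())

print(compile_with_ambient_authority("BILL"))         # True: confused deputy
print(compile_with_capability(alice_rights, "BILL"))  # False: write denied
```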
ACLs vs Capabilities

• ACLs
  o Good when users manage their own files
  o Protection is data-oriented
  o Easy to change rights to a resource
• Capabilities
  o Easy to delegate → avoid the confused deputy
  o Easy to add/delete users
  o More difficult to implement
• ACLs are used in practice far more often than capabilities

13
Multilevel Security (MLS) Models

The purpose of an MLS system is to enforce a form of access control by restricting subjects to objects for which they have the necessary clearance.

Classifications and Clearances

• Classifications apply to objects
• Clearances apply to subjects
• US Department of Defense (DoD) uses 4 levels:
  TOP SECRET
  SECRET
  CONFIDENTIAL
  UNCLASSIFIED

15
Clearances and Classification

• To obtain a SECRET clearance requires a routine background check
• A TOP SECRET clearance requires an extensive background check
• Practical classification problems
  o Proper classification is not always clear

16
Subjects and Objects

• Let O be an object, S a subject
  o O has a classification
  o S has a clearance
  o Security levels denoted L(O) and L(S)
• For DoD levels, we have
  TOP SECRET > SECRET > CONFIDENTIAL > UNCLASSIFIED

17
Multilevel Security (MLS)

• MLS needed when subjects/objects at different levels access the same system
• MLS is a form of access control
• Military and government interest in MLS for many decades
  o Lots of research into MLS
  o Strengths and weaknesses of MLS well understood (almost entirely theoretical)
  o Many possible uses of MLS outside the military

18
MLS Applications

• Classified government/military systems
• Business example: info restricted to
  o Senior management only, all management, everyone in company, or general public
• Network firewall
• Confidential medical info, databases, etc.
• Usually, MLS is not really a technical system
  o More like part of a legal structure

19
MLS Security Models

• MLS models explain what needs to be done
• Models do not tell you how to implement
• Models are descriptive, not prescriptive
  o That is, a high-level description, not an algorithm
• There are many MLS models
• We’ll discuss the simplest MLS model
  o Other models are more realistic
  o Other models are also more complex, more difficult to enforce, harder to verify, etc.

21
Bell-LaPadula

• BLP security model designed to express essential requirements for MLS
• BLP deals with confidentiality
  o To prevent unauthorized reading
• Recall that O is an object, S a subject
  o Object O has a classification
  o Subject S has a clearance
  o Security levels denoted L(O) and L(S)

22
Bell-LaPadula

• BLP consists of
  Simple Security Condition: S can read O if and only if L(O) ≤ L(S)
  *-Property (Star Property): S can write O if and only if L(S) ≤ L(O)
• No read up, no write down

23
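The two BLP conditions amount to one comparison each over the DoD ordering. A minimal Python sketch (the numeric encoding of the levels is an assumption, and this is an illustration, not a real reference monitor):

```python
# BLP's two conditions over the DoD ordering.
# Higher number = higher security level (assumed encoding).
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def can_read(subject_level, object_level):
    # Simple Security Condition: no read up, i.e., L(O) <= L(S)
    return LEVELS[object_level] <= LEVELS[subject_level]

def can_write(subject_level, object_level):
    # *-Property: no write down, i.e., L(S) <= L(O)
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("SECRET", "TOP SECRET"))    # False: no read up
print(can_write("SECRET", "CONFIDENTIAL")) # False: no write down
```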
McLean’s Criticisms of BLP

• McLean: BLP is “so trivial that it is hard to imagine a realistic security model for which it does not hold”
• McLean’s “System Z” allowed the administrator to reclassify an object, then “write down”
• Is this fair?
• Violates the spirit of BLP, but not expressly forbidden in the statement of BLP
• Raises fundamental questions about the nature of (and limits of) modeling

24
BLP: The Bottom Line

• BLP is simple, probably too simple
• BLP is one of the few security models that can be used to prove things about systems
• BLP has inspired other security models
  o Most other models try to be more realistic
  o Other security models are more complex
  o Models difficult to analyze, apply in practice

25
Biba’s Model

• BLP is for confidentiality, Biba is for integrity
  o Biba is to prevent unauthorized writing
• Biba is (in a sense) the dual of BLP
• Integrity model
  o Suppose you trust the integrity of O but not of O′
  o If object O″ includes O and O′ then you cannot trust the integrity of O″
• Integrity level of O″ is the minimum of the integrity of any object in O″

26
Biba’s Model

• Let I(O) denote the integrity of object O and I(S) denote the integrity of subject S
• Biba can be stated as
  Write Access Rule: S can write O if and only if I(O) ≤ I(S)
  (if S writes O, the integrity of O ≤ that of S)
  Biba’s Model: S can read O if and only if I(S) ≤ I(O)
  (if S reads O, the integrity of S ≤ that of O)

27
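Biba’s rules are the mirror image of BLP’s, which a small sketch makes visible. This illustration assumes hypothetical numeric integrity levels (higher number = higher integrity):

```python
# Biba as the dual of BLP, over hypothetical numeric integrity levels
# (higher number = higher integrity).

def biba_can_write(subject_integrity, object_integrity):
    # Write Access Rule: S can write O iff I(O) <= I(S)
    return object_integrity <= subject_integrity

def biba_can_read(subject_integrity, object_integrity):
    # Read rule: S can read O iff I(S) <= I(O)
    return subject_integrity <= object_integrity

# A low-integrity subject cannot contaminate a high-integrity object...
print(biba_can_write(1, 3))  # False
# ...and a high-integrity subject cannot read low-integrity data
print(biba_can_read(3, 1))   # False
```

Compare the inequalities with BLP’s: "no write up, no read down" instead of "no read up, no write down".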
Summary

28
MULTILATERAL SECURITY

• Multilevel security systems enforce access control (or information flow) “up and down”, where the security levels are ordered in a hierarchy.
• A simple hierarchy of security labels is not flexible enough to deal with realistic situations.
• Multilateral security uses compartments to further restrict information flow “across” security levels.
Compartments

• Multilevel Security (MLS) enforces access control up and down
• A simple hierarchy of security labels is generally not flexible enough
• Compartments enforce restrictions across
• Suppose TOP SECRET is divided into TOP SECRET {CAT} and TOP SECRET {DOG}
• Both are TOP SECRET, but information flow is restricted across the TOP SECRET level

30
Compartments

• Why compartments?
  o Why not create a new classification level?
• May not want either of
  o TOP SECRET {CAT} ≤ TOP SECRET {DOG}
  o TOP SECRET {DOG} ≤ TOP SECRET {CAT}
• Compartments designed to enforce the need-to-know principle
  o Regardless of clearance, you only have access to info that you need to know to do your job

31
Compartments

• Arrows indicate the “≥” relationship

[Figure: a lattice of labels. TOP SECRET {CAT, DOG} is above TOP SECRET {CAT} and TOP SECRET {DOG}, which are above TOP SECRET; SECRET {CAT, DOG} is above SECRET {CAT} and SECRET {DOG}, which are above SECRET; each TOP SECRET label is above the corresponding SECRET label.]

• Not all classifications are comparable, e.g., TOP SECRET {CAT} vs SECRET {CAT, DOG}
32
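The lattice ordering can be written as a single dominance check: one label dominates another when its level is at least as high and its compartment set is a superset. A minimal Python sketch (the label encoding as a (level, set) pair is an assumption):

```python
# Dominance with compartments: a label is a (level, compartment-set) pair.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(label_a, label_b):
    """True iff label_a >= label_b: higher-or-equal level AND
    a superset of compartments."""
    level_a, comps_a = label_a
    level_b, comps_b = label_b
    return LEVELS[level_a] >= LEVELS[level_b] and comps_a >= comps_b

ts_cat = ("TOP SECRET", {"CAT"})
s_catdog = ("SECRET", {"CAT", "DOG"})

# Neither label dominates the other: they are incomparable
print(dominates(ts_cat, s_catdog))  # False (missing the DOG compartment)
print(dominates(s_catdog, ts_cat))  # False (lower level)
```

This is exactly why TOP SECRET {CAT} and SECRET {CAT, DOG} are incomparable in the lattice.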
MLS vs Compartments

• MLS can be used without compartments
  o And vice-versa
• But MLS almost always uses compartments
• Example
  o MLS mandated for protecting medical records of the British Medical Association (BMA)
  o AIDS information was TOP SECRET, while other less sensitive information, such as drug prescriptions, was considered SECRET
  o What is the classification of an AIDS drug?
  o Anyone with a SECRET clearance could easily deduce TOP SECRET information. As a result, all information tended to be classified at the highest level, and consequently all users required the highest level of clearance, which defeated the purpose of the system
  o A compartments-only approach was used instead

33
Covert Channel

Covert Channel

• MLS designed to restrict legitimate channels of communication
• May be other ways for information to flow
• For example, resources shared at different levels could be used to “signal” information
• Covert channel: a communication path not intended as such by the system’s designers

35
Covert Channel Example

• Alice has a TOP SECRET clearance, Bob has a CONFIDENTIAL clearance
• Suppose the file space is shared by all users
• Alice creates file FileXYzW to signal “1” to Bob, and removes the file to signal “0”
• Once per minute Bob lists the files
  o If file FileXYzW does not exist, Bob believes that Alice sent 0
  o If file FileXYzW exists, Alice sent 1
• Alice can leak TOP SECRET info to Bob

36
Covert Channel Example

Alice:  Create file   Delete file   Create file   (no change)   Delete file
Bob:    Check file    Check file    Check file    Check file    Check file
Data:        1             0             1             1             0
Time:   ────────────────────────────────────────────────────────────▶

37
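The timeline above is easy to simulate. In this minimal sketch, a shared set stands in for the shared file space, and the file name follows the slides:

```python
# Simulation of the file-existence covert channel: Alice encodes each
# bit by creating or deleting a file; Bob reads the bit by checking
# whether the file exists. A shared set stands in for the file space.
shared_files = set()

def alice_send(bit):
    if bit == 1:
        shared_files.add("FileXYzW")      # create file -> signal 1
    else:
        shared_files.discard("FileXYzW")  # remove file -> signal 0

def bob_receive():
    return 1 if "FileXYzW" in shared_files else 0

message = [1, 0, 1, 1, 0]
received = []
for bit in message:        # one bit per "minute"
    alice_send(bit)
    received.append(bob_receive())

print(received)  # [1, 0, 1, 1, 0]: the TOP SECRET bits reach Bob
```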
Covert Channel

• Other possible covert channels?
  o Print queue
  o ACK messages
  o Network traffic, etc.
• When does a covert channel exist?
  1. Sender and receiver have a shared resource
  2. Sender able to vary some property of the resource that the receiver can observe
  3. “Communication” between sender and receiver can be synchronized

38
Covert Channel

• Potential covert channels are everywhere
• But it’s “easy” to eliminate covert channels:
  o “Just” eliminate all shared resources and all communication!
• Virtually impossible to eliminate covert channels in any useful information system
  o DoD guidelines: reduce covert channel capacity to no more than 1 bit/second
  o Implication? DoD has given up on eliminating covert channels

39
Real-World Covert Channel

• Hide data in the TCP header “reserved” field
• Or use covert_TCP, a tool to hide data in
  o Sequence number
  o ACK number

[Figure: TCP header layout, bits 0 through 31: Source Port | Destination Port; Sequence Number; Acknowledgement Number; Offset | reserved | flag bits (U A P R S F) | Window; Checksum | Urgent Pointer; Options | Padding; Data (variable length)]

40
Real-World Covert Channel

• Hide data in TCP sequence numbers
• Tool: covert_TCP
• Sequence number X contains the covert info

[Figure: A (covert_TCP sender) sends a SYN with spoofed source C, destination B, and SEQ: X. B (innocent server) replies to C with an ACK (or RST) carrying ACK: X. C (covert_TCP receiver) recovers X.]

41
Inference Control

Inference Control Example

• Suppose we query a database
  o Question: What is the average salary of female CS professors at SJSU?
  o Answer: $95,000
  o Question: How many female CS professors at SJSU?
  o Answer: 1
• Specific information has leaked from responses to general questions!

43
Inference Control & Research

• For example, medical records are private but valuable for research
• How to make info available for research while protecting privacy?
• How to allow access to such data without leaking specific information?

44
Naïve Inference Control

• Remove names from medical records?
• Still may be easy to get specific info from such “anonymous” data
• Removing names is not enough
  o As seen in the previous example
• What more can be done?

45
Less-Naïve Inference Control

• Query set size control
  o Don’t return an answer if the set size is too small
• N-respondent, k% dominance rule
  o Do not release a statistic if N or fewer respondents contribute k% or more of its value
  o Example: average salary in Bill Gates’ neighborhood
  o This approach is used by the US Census Bureau
• Randomization
  o Add a small amount of random noise to the data
• Many other methods → none satisfactory

46
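Two of these defenses fit in a few lines. A minimal Python sketch (the data, threshold, and noise range are hypothetical), applied to the salary query from the earlier example:

```python
# Sketch of query set size control and randomization.
import random

salaries = {"prof_a": 95000}  # one respondent -> a tiny query set

def avg_salary(records, min_set_size=5):
    # Query set size control: refuse to answer when the set is too small
    if len(records) < min_set_size:
        return None  # answer suppressed
    return sum(records.values()) / len(records)

def noisy_avg_salary(records, noise=1000.0):
    # Randomization: add a small amount of random noise to the statistic
    true_avg = sum(records.values()) / len(records)
    return true_avg + random.uniform(-noise, noise)

print(avg_salary(salaries))  # None: the revealing query is blocked
```

As the slides note, neither defense is fully satisfactory: the thresholds can often be worked around with carefully chosen overlapping queries.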
Netflix Example

• Netflix prize → $1M to the first to improve their recommendation system by 10% or more
• Netflix created a dataset for the contest
  o Movie preferences of real users
  o Usernames removed, some “noise” added
• Insufficient inference control
  o Researchers were able to correlate IMDB reviews with those in the Netflix dataset

47
Firewalls

Firewalls

• Firewall decides what to let in to the internal network and/or what to let out
• Access control for the network

Internet ⇄ Firewall ⇄ Internal network

49
Firewall as Secretary

• A firewall is like a secretary
• To meet with an executive
  o First contact the secretary
  o Secretary decides if the meeting is important
  o So, the secretary filters out many requests
• You want to meet the chair of the CS department?
  o Secretary does some filtering

50
Firewall Terminology

• No standard firewall terminology
• Types of firewalls
  o Packet filter → works at the network layer
  o Stateful packet filter → transport layer
  o Application proxy → application layer
• Lots of other terms often used
  o E.g., “deep packet inspection”

51
Packet Filter

• Operates at the network layer
• Can filter based on…
  o Source IP address
  o Destination IP address
  o Source port
  o Destination port
  o Flag bits (SYN, ACK, etc.)
  o Egress or ingress

[Sidebar: protocol stack (application, transport, network, link, physical) with the network layer highlighted]

52
Packet Filter

• Advantages?
  o Speed
• Disadvantages?
  o No concept of state
  o Cannot see TCP connections
  o Blind to application data

53
Packet Filter

• Configured via Access Control Lists (ACLs)

Action   Source IP   Dest IP   Source Port   Dest Port   Protocol   Flag Bits
Allow    Inside      Outside   Any           80          HTTP       Any
Allow    Outside     Inside    80            > 1023      HTTP       ACK
Deny     All         All       All           All         All        All

54
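Matching a packet against this ACL table can be sketched as first-match-wins rule processing. The rule encoding, field names, and wildcard handling below are illustrative assumptions, not any real firewall's configuration syntax:

```python
# Sketch of matching a packet against the ACL table above.
# "Any"/"All" act as wildcards; the first matching rule wins.
RULES = [
    # (action, src, dst, sport, dport, proto, flags)
    ("Allow", "Inside", "Outside", "Any", 80, "HTTP", "Any"),
    ("Allow", "Outside", "Inside", 80, ">1023", "HTTP", "ACK"),
    ("Deny", "All", "All", "All", "All", "All", "All"),
]

def field_matches(rule_val, pkt_val):
    if rule_val in ("Any", "All"):
        return True
    if rule_val == ">1023":
        return isinstance(pkt_val, int) and pkt_val > 1023
    return rule_val == pkt_val

def filter_packet(pkt):
    keys = ("src", "dst", "sport", "dport", "proto", "flags")
    for action, *fields in RULES:
        if all(field_matches(r, pkt[k]) for r, k in zip(fields, keys)):
            return action
    return "Deny"  # default deny if no rule matches

# A reply from an outside web server to an inside client
web_reply = {"src": "Outside", "dst": "Inside", "sport": 80,
             "dport": 4096, "proto": "HTTP", "flags": "ACK"}
print(filter_packet(web_reply))  # Allow
```

Note that the second rule trusts the ACK flag, which is exactly what the TCP ACK scan on the next slides exploits.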
TCP ACK Scan

• Attacker scans for open ports through the firewall
  o Port scanning is often the first step in a network attack
• Attacker sends packets with the ACK bit set, without a prior 3-way handshake
  o Violates the TCP/IP protocol
  o ACK packets pass through the packet filter firewall
  o They appear to be part of an ongoing connection
  o RST is sent by the recipient of such a packet

55
TCP ACK Scan

• Attacker learns that port 1209 is open through the firewall
• A stateful packet filter can prevent this
  o Since the scans are not part of established connections

[Figure: Trudy sends ACK packets to destination ports 1207, 1208, and 1209 through the packet filter; the ACK to the open port 1209 reaches the internal network, whose host replies with RST, revealing the open port.]

56
Stateful Packet Filter

• Adds state to a packet filter
• Operates at the transport layer
• Remembers TCP connections, flag bits, etc.
• Can even remember UDP packets (e.g., DNS requests)

57
Stateful Packet Filter

• Advantages?
  o Can do everything a packet filter can do plus…
  o Keep track of ongoing connections (e.g., prevents TCP ACK scan)
• Disadvantages?
  o Cannot see application data
  o Slower than packet filtering

58
Application Proxy

• A proxy is something that acts on your behalf
• Application proxy looks at incoming application data
• Verifies that data is safe before letting it in

59
Application Proxy

• Advantages?
  o Complete view of connections and application data
  o Filters bad data at the application layer (viruses, Word macros)
• Disadvantages?
  o Speed

60
Application Proxy

• Creates a new packet before sending it through to the internal network
• Attacker must talk to the proxy and convince it to forward the message
• Proxy has a complete view of the connection
• Can prevent some scans that a stateful packet filter cannot

61
Firewalk

• Tool to scan for open ports through a firewall
• Attacker knows the IP address of the firewall and the IP address of one system inside the firewall
  o Set TTL to 1 more than the number of hops to the firewall, and set the destination port to N
• If the firewall allows data on port N through, the attacker gets a “time exceeded” error message
  o Otherwise, no response

62
Firewalk and Proxy Firewall

• This will not work through an application proxy (why?)
• The proxy creates a new packet, destroying the old TTL

[Figure: Trudy sends packets with destination ports 12343, 12344, and 12345, each with TTL=4, through three routers to the packet filter; a “time exceeded” message comes back for the port the firewall allows through.]

63
Deep Packet Inspection

• Many buzzwords are used for firewalls
  o One example: deep packet inspection
• What could this mean?
• Look into packets, but don’t fully “process” the packets
  o Like an application proxy, but faster

64
Firewalls and Defense in Depth

• Typical network security architecture

Internet → Packet Filter → DMZ (FTP server, Web server, DNS server) → Application Proxy → Intranet with additional defense

65
Intrusion Detection Systems

Intrusion Prevention

• Want to keep the bad guys out
• Intrusion prevention is a traditional focus of computer security
  o Authentication is to prevent intrusions
  o Firewalls are a form of intrusion prevention
  o Virus defenses are aimed at intrusion prevention
  o Like locking the door on your car

67
Intrusion Detection

• In spite of intrusion prevention, bad guys will sometimes get in
• Intrusion detection systems (IDS)
  o Detect attacks in progress (or soon after)
  o Look for unusual or suspicious activity
• IDS evolved from log file analysis
• IDS is currently a hot research topic
• How to respond when an intrusion is detected?
  o We don’t deal with this topic here…

68
Intrusion Detection Systems

• Who is the likely intruder?
  o May be an outsider who got through the firewall
  o May be an evil insider
• What do intruders do?
  o Launch well-known attacks
  o Launch variations on well-known attacks
  o Launch new/little-known attacks
  o “Borrow” system resources
  o Use compromised systems to attack others, etc.

69
IDS

• Intrusion detection approaches
  o Signature-based IDS
  o Anomaly-based IDS
• Intrusion detection architectures
  o Host-based IDS
  o Network-based IDS
• Any IDS can be classified as above
  o In spite of marketing claims to the contrary!

70
Host-Based IDS

• Monitors activities on hosts for
  o Known attacks
  o Suspicious behavior
• Designed to detect attacks such as
  o Buffer overflow
  o Escalation of privilege, …
• Little or no view of network activities

71
Network-Based IDS

• Monitors activity on the network for…
  o Known attacks
  o Suspicious network activity
• Designed to detect attacks such as
  o Denial of service
  o Network probes
  o Malformed packets, etc.
• Some overlap with firewalls
• Little or no view of host-based attacks
• Can have both host and network IDS

72
Signature Detection Example

• Failed login attempts may indicate a password cracking attack
• IDS could use the rule “N failed login attempts in M seconds” as a signature
• If N or more failed login attempts occur in M seconds, the IDS warns of an attack
• Note that such a warning is specific
  o Admin knows what attack is suspected
  o Easy to verify the attack (or false alarm)

73
Signature Detection

• Suppose the IDS warns whenever N or more failed logins occur in M seconds
  o Set N and M so that false alarms are not common
  o Can do this based on “normal” behavior
• But if Trudy knows the signature, she can try N − 1 logins every M seconds…
• Then signature detection slows Trudy down, but might not stop her

74
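The "N failed logins in M seconds" signature is a sliding window over attempt timestamps. A minimal Python sketch (N, M, and the interface are hypothetical choices), which also shows Trudy's evasion by staying under the threshold:

```python
# "N failed logins in M seconds" as a sliding window of timestamps.
from collections import deque

def make_detector(n=5, m=60):
    window = deque()
    def record_failure(timestamp):
        window.append(timestamp)
        # Drop attempts older than M seconds
        while window and timestamp - window[0] > m:
            window.popleft()
        return len(window) >= n  # True -> raise the alarm
    return record_failure

detect = make_detector(n=5, m=60)
alarms = [detect(t) for t in [0, 10, 20, 30, 40]]
print(alarms[-1])  # True: 5 failures within 60 seconds

# Trudy evades by never exceeding N-1 failures per window
detect2 = make_detector(n=5, m=60)
print(any(detect2(t) for t in [0, 20, 40, 61, 81, 101, 122]))  # False
```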
Signature Detection

• Many techniques are used to make signature detection more robust
• Goal is to detect “almost” signatures
• For example, if “about” N login attempts occur in “about” M seconds
  o Warn of a possible password cracking attempt
  o What are reasonable values for “about”?
  o Can use statistical analysis, heuristics, etc.
  o Must not increase the false alarm rate too much

75
Signature Detection

• Advantages of signature detection
  o Simple
  o Detects known attacks
  o Know which attack at time of detection
  o Efficient (if reasonable number of signatures)
• Disadvantages of signature detection
  o Signature files must be kept up to date
  o Number of signatures may become large
  o Can only detect known attacks
  o Variation on a known attack may not be detected

76
Anomaly Detection

• Anomaly detection systems look for unusual or abnormal behavior
• There are (at least) two challenges
  o What is normal for this system?
  o How “far” from normal is abnormal?
• No avoiding statistics here!
  o The mean defines normal
  o The variance gives the distance from normal to abnormal

77
How to Measure Normal?

• How to measure normal?
  o Must measure during “representative” behavior
  o Must not measure during an attack…
  o …or else the attack will seem normal!
  o Normal is the statistical mean
  o Must also compute the variance to have any reasonable idea of abnormal

78
How to Measure Abnormal?

• Abnormal is relative to some “normal”
  o Abnormal indicates a possible attack
• Statistical discrimination techniques include
  o Bayesian statistics
  o Linear discriminant analysis (LDA)
  o Quadratic discriminant analysis (QDA)
  o Neural nets, hidden Markov models (HMMs), etc.
• Fancy modeling techniques also used
  o Artificial intelligence
  o Artificial immune system principles
  o Many, many, many others

79
Anomaly Detection (1)

• Suppose we monitor use of three commands:
  open, read, close
• Under normal use we observe Alice:
  open, read, close, open, open, read, close, …
• Of the nine possible ordered pairs, we see that four pairs are normal for Alice:
  (open,read), (read,close), (close,open), (open,open)
• Can we use this to identify unusual activity?

80
Anomaly Detection (1)

• We monitor use of the three commands
  open, read, close
• If the ratio of abnormal to normal pairs is “too high”, warn of a possible attack
• Could improve this approach by
  o Also using the expected frequency of each pair
  o Using more than two consecutive commands
  o Including more commands/behaviors in the model
  o More sophisticated statistical discrimination

81
Anomaly Detection (2)

• Over time, Alice has accessed file Fn at rate Hn (initial file access rates)
• Recently, “Alice” has accessed Fn at rate An

  H0    H1    H2    H3
  .10   .40   .40   .10

  A0    A1    A2    A3
  .10   .40   .30   .20

• Is this normal use for Alice?
• We compute S = (H0−A0)² + (H1−A1)² + (H2−A2)² + (H3−A3)² = .02
  o We consider S < 0.1 to be normal, so this is normal
• How to account for use that varies over time?
82
Anomaly Detection (2)

• To allow “normal” to adapt to new use, we update the averages: Hn = 0.2·An + 0.8·Hn
• In this example, the Hn are updated: H2 = 0.2·0.3 + 0.8·0.4 = .38 and H3 = 0.2·0.2 + 0.8·0.1 = .12
• And we now have

  H0    H1    H2    H3
  .10   .40   .38   .12

83
Anomaly Detection (2)

• The updated long-term averages are

  H0    H1    H2    H3
  .10   .40   .38   .12

• Suppose the new observed rates are…

  A0    A1    A2    A3
  .10   .30   .30   .30

• Is this normal use?
• Compute S = (H0−A0)² + … + (H3−A3)² = .0488
  o Since S = .0488 < 0.1, we consider this normal
• And we again update the long-term averages: Hn = 0.2·An + 0.8·Hn
84
Anomaly Detection (2)

• The starting averages were:

  H0    H1    H2    H3
  .10   .40   .40   .10

• After 2 iterations, the averages are:

  H0    H1    H2    H3
  .10   .38   .364  .156

• Statistics slowly evolve to match behavior
• This reduces false alarms for the system administrator
• But it also opens an avenue for attack…
  o Suppose Trudy always wants to access F3
  o Can she convince the IDS this is normal for Alice?

85
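The computations in this worked example fit in a few lines: the anomaly score S = Σ(Hi − Ai)² and the exponential update Hn = 0.2·An + 0.8·Hn. A minimal Python sketch reproducing the numbers from the slides:

```python
# Anomaly score and long-term average update from the worked example.

def score(H, A):
    # S = sum of squared differences between long-term and observed rates
    return sum((h - a) ** 2 for h, a in zip(H, A))

def update(H, A, alpha=0.2):
    # Exponential update: new Hn = alpha*An + (1 - alpha)*Hn
    return [alpha * a + (1 - alpha) * h for h, a in zip(H, A)]

H = [0.10, 0.40, 0.40, 0.10]   # initial file access rates
A1 = [0.10, 0.40, 0.30, 0.20]  # first observed rates

s1 = score(H, A1)
print(round(s1, 4))                  # 0.02 -> below 0.1, considered normal
H = update(H, A1)
print([round(h, 3) for h in H])      # [0.1, 0.4, 0.38, 0.12]

A2 = [0.10, 0.30, 0.30, 0.30]  # second observed rates
s2 = score(H, A2)
print(round(s2, 4))                  # 0.0488 -> still considered normal
H = update(H, A2)
print([round(h, 3) for h in H])      # [0.1, 0.38, 0.364, 0.156]
```

Running the update with Trudy's shifted rates a few more times shows how the averages drift toward her behavior, which is exactly the "going slow" attack discussed next.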
Anomaly Detection (2)

• To make this approach more robust, we must incorporate the variance
• Can also combine N stats Si as, say,
  T = (S1 + S2 + S3 + … + SN) / N
  to obtain a more complete view of “normal”
• A similar (but more sophisticated) approach is used in an IDS known as NIDES
• NIDES combines anomaly and signature IDS

86
Anomaly Detection Issues

• Systems constantly evolve and so must the IDS
  o A static system would place a huge burden on the admin
  o But an evolving IDS makes it possible for an attacker to (slowly) convince the IDS that an attack is normal
  o Attacker may win simply by “going slow”
• What does “abnormal” really mean?
  o Indicates there may be an attack
  o Might not be any specific info about the “attack”
  o How to respond to such vague information?
  o In contrast, signature detection is very specific

87
Anomaly Detection

• Advantages?
  o Chance of detecting unknown attacks
• Disadvantages?
  o Cannot use anomaly detection alone…
  o …must be used with signature detection
  o Reliability is unclear
  o May be subject to attack
  o Anomaly detection indicates “something unusual”, but lacks specific info on a possible attack

88
Anomaly Detection: The Bottom Line

• Anomaly-based IDS is an active research topic
• Many security experts have high hopes for its ultimate success
• Often cited as a key future security technology
• Hackers are not convinced!
  o Title of a talk at Defcon: “Why Anomaly-based IDS is an Attacker’s Best Friend”
• Anomaly detection is difficult and tricky
• As hard as AI?

89
