===============================================================
Lect 12 - November 4, 2008 - ECS 20 - Fall 2008 - Phil Rogaway
===============================================================
Today: o Functions
- SCRIBE (none today, use your own notes for the class notes)
Announcements:
o None!
------------------------------------------
1. Review of Functions
------------------------------------------
Definition: A function f from A to B is a relation on A x B such that
            there is one and only one pair (a,b) in f for every a in A.
Vocabulary: domain
co-domain
image
preimage
one-to-one / injective
onto / surjective
            one-to-one and onto / bijective
            equality, inequality (if f, g are arbitrary functions, when do we say f = g?)
   hd(x) = the first character of the string x, for x \ne \emptystring
   tl(x) = all but the first character of x (how should we define it when x = \emptystring?)
dim(A) = the dimensions of the matrix A, regarded as a pair of natural numbers
x+y = sum of x and y in the integers, say: +: \Z^2 -> \Z
length(L) = T* -> \N for some type T
Two functions f and g are equal, f=g,
if their domains and ranges are equal
and f(x) = g(x) for all x in Dom(f)
Function composition
       g o f
         f: A -> B,  g: B -> C
       then (g o f) : A -> C is defined by
            (g o f)(x) = g(f(x))
Kind of "backwards" notation, but it's traditional, "necessary"
if we want functions to operate with their argument to the right.
       Some algebraists do in fact reverse it, writing
            (x)(f o g)
Some computer scientists like to denote functions by "lambda expressions"
To say that f is the function that maps x to x^2 we write
f = lambda x. x^2
Here x is just a formal variable;
lambda x . x^2 = lambda y . y^2
Usually the domain is not explicitly indicated in this notation.
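As a sketch of this notation in a real language (Python's lambda is the direct analogue; the names f and g here are just for illustration):

```python
# The bound variable is a formal placeholder: renaming it gives the same function.
f = lambda x: x ** 2
g = lambda y: y ** 2

print(f(5), g(5))  # 25 25
```

As in the notes, f and g are equal as mathematical functions (they agree on every input), though Python itself compares them by identity, so `f == g` is False.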
------------------------------------------
2. Examples
------------------------------------------
Make table:
injective surjective bijective
----------------------------------------------------------------
f: \R -> \R by yes yes yes
defined by
f(x) = x+1.
(Can write
x |--> x+1)
(Don't write
x -> x+1)
(Can write
f = lambda x.x+1)
----------------------------------------------------------------
f: \R -> \R defined no no no
by f(x) = x^2
----------------------------------------------------------------
f: \R+ -> \R+ defined yes yes yes
by f(x) = x^2
-------------------------------------------------------------------
bday: \people ->
  [1..12] x [1..31]       no          yes        no
where bday(x) encodes
x's birthday
----------------------------------------------------------------
exp(x): \R -> (0,infty) yes yes yes
When bijective, the
*inverse* exists.
Define it
----------------------------------------------------------------
ln(x): (0,infty)-> \R yes yes yes
----------------------------------------------------------------
f(x) = 2x mod 6 no no no
from and to {0,1,2,3,4,5}
    make a table:
        x     0  1  2  3  4  5
        f(x)  0  2  4  0  2  4
    (the set {0,1,...,5} is Z_6)
----------------------------------------------------------------
f(x) = 2x mod 7 yes yes yes
from and to {0,1,...,6}
Z_7
----------------------------------------------------------------
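The finite examples in the table can be checked mechanically. A small Python sketch (the helper names is_injective and is_surjective are mine, not library functions):

```python
def is_injective(f, domain):
    """f is injective iff no two domain elements map to the same image."""
    images = [f(x) for x in domain]
    return len(set(images)) == len(images)

def is_surjective(f, domain, codomain):
    """f is surjective iff every codomain element is hit."""
    return set(f(x) for x in domain) == set(codomain)

Z6, Z7 = range(6), range(7)

# 2x mod 6 collapses values: its image is only {0, 2, 4}
print(is_injective(lambda x: 2*x % 6, Z6))         # False
print(is_surjective(lambda x: 2*x % 6, Z6, Z6))    # False

# gcd(2, 7) = 1, so multiplication by 2 permutes Z_7
print(is_injective(lambda x: 2*x % 7, Z7))         # True
print(is_surjective(lambda x: 2*x % 7, Z7, Z7))    # True
```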
Example: f(x) = exp(x) = e^x
Draw picture.
What's the domain? \R
What's the image? (0,\infty)
Is it 1-1 on this image? YES
          What's its inverse?
e^x = y
x = ln(y)
   Example: f(x) = x e^x.  Draw it.
     On what domain does the inverse exist?
        answer: [0, infinity)
Could you define a larger portion of the curve on which the
inverse exists:
        Yes, [-1, infinity)  (on which f takes values in [-1/e, infinity))
Graph the function. Take its derivative to see
that it vanishes at -1, when the function takes
on the value -1/e.
How would you find the inverse f^{-1}(2), for example?
x e^x = 2.
Binary search
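One way to sketch that binary search (solve_xex and the bracket [0, 10] are my choices; any bracket containing the root works, since x e^x is increasing on [0, infinity)):

```python
import math

def solve_xex(y, lo=0.0, hi=10.0, tol=1e-12):
    """Solve x * e^x = y for x in [lo, hi] by binary search.
    Works because x * e^x is strictly increasing on [0, infinity)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid * math.exp(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x = solve_xex(2.0)
print(x)                # about 0.8526 (the Lambert W value W(2))
print(x * math.exp(x))  # about 2.0
```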
d(n) = number of divisors of n
p(n) = number of prime numbers \le n
etc.
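Brute-force sketches of these counting functions (the names d and p follow the notes; real implementations would be much more efficient):

```python
def d(n):
    """Number of positive divisors of n."""
    return sum(1 for k in range(1, n + 1) if n % k == 0)

def p(n):
    """Number of primes <= n (usually written pi(n))."""
    def is_prime(m):
        return m >= 2 and all(m % k for k in range(2, int(m ** 0.5) + 1))
    return sum(1 for m in range(2, n + 1) if is_prime(m))

print(d(12))  # 6: divisors are 1, 2, 3, 4, 6, 12
print(p(10))  # 4: primes are 2, 3, 5, 7
```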
------------------------------------------------
3. Important functions
------------------------------------------------
   \lceil x \rceil
   \lfloor x \rfloor
a mod b
|x|
e^x
2^x
3^x
log
ln
lg
n!
Review of properties of logs:
log(ab) = log(a) + log(b)
log_a(b) = log_c(b) / log_c(a)
       e^{ab} = (e^a)^b
a^x a^y = a^{x+y}
Draw picture of y = lg(x)
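A quick numeric sanity check of these log identities (the particular values a = 8, b = 32, c = 2 are arbitrary choices):

```python
import math

a, b, c = 8.0, 32.0, 2.0

# log(ab) = log(a) + log(b)
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))

# change of base: log_a(b) = log_c(b) / log_c(a)
assert math.isclose(math.log(b, a), math.log(b, c) / math.log(a, c))

# e^{ab} = (e^a)^b
assert math.isclose(math.exp(a * b), math.exp(a) ** b)

# a^x a^y = a^{x+y}
assert math.isclose(2 ** 3 * 2 ** 4, 2 ** (3 + 4))

print("all log identities check out")
```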
------------------------------------------------
4. Comparing functions
------------------------------------------------
Which is bigger, x^3 or 2^x?
    No formal meaning we know of -- it depends on x.
But there is some sense in which 2^x seems bigger
Draw graphs.
We say x^3 \in O(2^x). Kind of like saying x^3 \le 2^x
"for large x and if you don't care about constants"
...............................................
Def (Def 3.4 from Schaum's)
O(g) = {f: \R -> \R such that exists N>0,C>0 s.t.
|f(n)| <= C |g(n)| for all n>=N}
...............................................
You can actually forget about the | . | and just
imagine that f is non-negative valued: replace f by |f|
in case it's not.
Example:
n^2 + 100n \in O(n^2) YES
10n^2 \in O(n^2) YES
      10n^2 + 100n + log n  in O(n^2)     YES
n log n in O(n^2) YES
      n^2 log n  in O(n log n)            NO
How to prove things like this?
Suppose want to show
10 n log(n) + 50n + 1 in O(n log n)
must find a large enough C, N such that
10 n log(n) + 50n + 1 <= C n log n
for all n >= N.
    What C, N would you like?  How about
10 n log(n) + 50n + 1 <= 61 n log n
Check: true if
50n + 1 <= 51 n log n
Now 50n < 50n log n if n\ge 3
So the above is true if
1 <= n log n
But for n >= 3, this is certainly true.
Doing it in the forward direction:
1 <= n log n
10n log n + 50n + 1 <= 10n log n + 50 n log n + n log n <= 61 n log n
when n>=3
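A spot check of the inequality with a program (taking lg = log base 2; the helper name holds is mine, and a finite check is of course no proof):

```python
import math

# Check the claim: 10 n lg n + 50 n + 1 <= 61 n lg n for all n >= 3.
def holds(n):
    return 10 * n * math.log2(n) + 50 * n + 1 <= 61 * n * math.log2(n)

assert all(holds(n) for n in range(3, 100001))
print("inequality verified for 3 <= n <= 100000")
```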
n! vs 2^n
n! \in O(2^n) NO
2^n \in O(n!) Yes
n! = (n/e)^n \sqrt(2\pi n) (1 + O(1/n))
ln n! \approx n ln n - n
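Checking the approximation ln n! ≈ n ln n - n numerically (math.lgamma(n+1) gives ln(n!) without overflow):

```python
import math

# Compare ln(n!) with n ln n - n; the omitted term is roughly (1/2) ln(2 pi n),
# so the relative error shrinks as n grows.
for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)       # ln(n!)
    approx = n * math.log(n) - n
    print(n, exact, approx)
```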
      n      lg n    n       n lg n    n^2     n^3     2^n
   ------------------------------------------------------------
      10     4       10      40        100     10^3    1024
      100    7       100     700       10^4    10^6    10^30
      1000   10      1000    10^4      10^6    10^9    10^300
...........................................
   Theta(g) = {f: \R -> \R : exists c, C, N > 0 such that
                  c g(n) <= |f(n)| <= C g(n) for all n >= N}
...........................................
Theta(n^2) contains: 7n^2, 3n^2 + 100lg n
doesn't contain n^2 lg n (too big)
n lg^2 n (too small)
Draw a picture of common growth rates
Theta(n!)
Theta(2^n)
Theta(n^3)
Theta(n^2)
Theta(n log n log log n)
Theta(n lg n)
Theta(n)
        Theta(sqrt(n))
Theta(log n)
Theta(1)
exercise: where is \sqrt(n)
  How to compare n^{1.5} and n log n?
       n^{1.5} = n^{0.5} * n^{0.5} * n^{0.5}
       n log n = n^{0.5} * n^{0.5} * log n
  Since n^{0.5} eventually dominates log n, n^{1.5} is the bigger.
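A numeric check: the ratio n^{1.5} / (n lg n) = sqrt(n) / lg n keeps growing, so n^{1.5} wins eventually:

```python
import math

# sqrt(n) / lg(n) grows without bound as n increases.
ratios = [n ** 1.5 / (n * math.log2(n)) for n in (10, 10**3, 10**6)]
print(ratios)
assert ratios[0] < ratios[1] < ratios[2]
```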
  How long does the following code take to run?
       for i = 1 to n do
           for j = 1 to i do
               s += (i+j)^2 - (i+j)
   O(n^2)?  Yes.
   Theta(n^2) -- the inner statement executes 1 + 2 + ... + n = n(n+1)/2 times.
   O(n^3) is true ... but not "tight"
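Counting the inner-loop executions directly confirms the Theta(n^2) bound (count_iterations is my name for the instrumented loop):

```python
def count_iterations(n):
    """Run the doubly-nested loop from the notes, counting inner-loop executions."""
    s = 0
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            s += (i + j) ** 2 - (i + j)
            count += 1
    return count

# The inner statement runs 1 + 2 + ... + n = n(n+1)/2 times: Theta(n^2).
for n in (10, 100, 1000):
    assert count_iterations(n) == n * (n + 1) // 2
print(count_iterations(100))  # 5050
```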