A description is required for a data set {x_i, i = 1, ..., N}. We try to find a closed and compact sphere area Ω with minimum volume which contains all (or most of) the needed objects, while the outliers lie outside Ω. Figure 1 shows the sketch of Support Vector Domain Description (SVDD).

To determine whether a test point z is within the sphere, the distance from z to the center a of the sphere has to be calculated. A test object z is accepted when this distance is smaller than the radius, i.e., when (z - a)^T (z - a) \le R^2. Expressing the center of the sphere in terms of the support vectors, we accept a test object z when

    \|z - a\|^2 = K(z, z) - 2\sum_i \alpha_i K(z, x_i) + \sum_{i,j} \alpha_i \alpha_j K(x_i, x_j) \le R^2.

Fig. 2. Performance of two incremental learning algorithms (prediction accuracy on the test set, in %, against incremental learning steps 1-10; plotted curves not reproduced).

From Figure 2 we can see that after each step of incremental training the prediction accuracy on the test set varies only slightly, which satisfies the requirement of incremental learning; we can also see that the learning performance of the algorithm is gradually improved, and that the algorithm has the ability to recover its performance. So the incremental algorithm presented in this paper meets the demands of incremental learning.

The experimental results show that our algorithm has learning performance similar to that of the popular ISVM algorithm presented in [9]. Another discovery in our experiment is that, as our incremental learning algorithm proceeds, the improvement in learning performance becomes smaller and smaller, until finally the learning performance no longer improves. This indicates that this characteristic can be used to estimate the number of samples required for the problem description.

5. Conclusion

In this paper we proposed an incremental learning algorithm based on the support vector domain classifier (SVDC). Its key idea is to obtain the initial concept using standard SVDC and then to update it with the technique presented in this paper, which in fact amounts to solving a QP problem similar to the one solved in the standard SVDC algorithm. Experiments show that our algorithm is effective and promising. Other characteristics of this algorithm include: the updating model has a mathematical form similar to that of standard SVDC; a sparse expression of its solutions can be acquired; the algorithm can return to the last step without extra computation; and, furthermore, the algorithm can be used to estimate the number of samples required for the problem description.

REFERENCES

[1] C. Cortes, V. N. Vapnik. Support vector networks. Mach. Learn. 20 (1995), pp. 273-297.
[2] V. N. Vapnik. Statistical Learning Theory. Wiley, New York, 1998.
[4] S. Tong, E. Chang. Support Vector Machine Active Learning for Image Retrieval. Proceedings of the ACM International Conference on Multimedia, 2000, pp. 107-118.
[5] Yang Deng et al. A New Method in Data Mining: Support Vector Machines. Beijing: Science Press, 2004.
[6] L. Baoqing. Distance-based selection of potential support vector by kernel matrix. International Symposium on Neural Networks 2004, LNCS 3173, pp. 468-473, 2004.
[7] D. Tax. One-class classification. PhD thesis, Delft University of Technology, http://www.phtn.tudelft.nl/~davidt/thesis.pdf (2001).
[8] N. A. Syed, H. Liu, K. Sung. From incremental learning to model independent instance selection - a support vector machine approach. Technical Report TRA9/99, NUS, 1999.
[9] L. Yangguang, C. Qi, T. Yongchuan et al. Incremental updating method for support vector machine. APWeb 2004, LNCS 3007, pp. 426-435, 2004.
[10] S. R. Gunn. Support vector machines for classification and regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, 1997.
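The SVDD acceptance test described above can be sketched in a few lines of code. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian kernel, and the support vectors, multipliers, and squared radius below are hand-picked toy values rather than values from the paper.

```python
import math

def rbf_kernel(x, y, sigma=1.0):
    """Gaussian kernel K(x, y) = exp(-||x - y||^2 / sigma^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / sigma ** 2)

def svdd_accepts(z, support_vectors, alphas, radius_sq, sigma=1.0):
    """Accept a test object z when its kernel distance to the sphere center
    satisfies K(z,z) - 2*sum_i a_i K(z,x_i) + sum_{i,j} a_i a_j K(x_i,x_j) <= R^2."""
    k_zz = rbf_kernel(z, z, sigma)  # always 1 for the Gaussian kernel
    k_zx = sum(a * rbf_kernel(z, x, sigma)
               for a, x in zip(alphas, support_vectors))
    k_xx = sum(ai * aj * rbf_kernel(xi, xj, sigma)
               for ai, xi in zip(alphas, support_vectors)
               for aj, xj in zip(alphas, support_vectors))
    return k_zz - 2.0 * k_zx + k_xx <= radius_sq

# Toy description: two support vectors with equal weights (illustrative values only).
svs = [(0.0, 0.0), (1.0, 0.0)]
alphas = [0.5, 0.5]
r_sq = 0.6

print(svdd_accepts((0.5, 0.1), svs, alphas, r_sq))  # point near the data: True
print(svdd_accepts((5.0, 5.0), svs, alphas, r_sq))  # distant outlier: False
```

Note that only the first term depends on z twice; for the Gaussian kernel K(z, z) = 1 and the double sum over support vectors is a constant, so in practice it would be precomputed once per model rather than per test point.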
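The experiments observe that accuracy gains shrink as incremental training proceeds, and that this can be used to estimate the number of samples needed for the problem description. One way to turn that observation into a concrete rule is a plateau criterion, sketched below; the tolerance `eps`, the `patience` parameter, and the toy accuracy values are illustrative assumptions, not values from the paper.

```python
def estimate_required_samples(accuracies, batch_size, eps=0.005, patience=2):
    """Given per-step test accuracies of an incremental learner (one batch of
    `batch_size` samples per step), return the estimated number of samples
    consumed when accuracy stops improving by more than `eps` for `patience`
    consecutive steps. Falls back to the full sample count if no plateau occurs."""
    flat = 0
    for step in range(1, len(accuracies)):
        if accuracies[step] - accuracies[step - 1] <= eps:
            flat += 1
            if flat >= patience:
                # Plateau began `patience` steps back; count batches up to there.
                return (step + 1 - patience) * batch_size
        else:
            flat = 0
    return len(accuracies) * batch_size

# Toy accuracy trace: steady gains that flatten out after the fourth batch.
trace = [0.60, 0.70, 0.75, 0.78, 0.781, 0.781, 0.781]
print(estimate_required_samples(trace, batch_size=50))  # -> 200
```

With this trace the rule reports 200 samples (four batches of 50), matching where the accuracy curve levels off.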