Raftul cu initiativa Book Archive

Machine Theory

Nearest-Neighbor Methods in Learning and Vision by Gregory Shakhnarovich, Trevor Darrell, Piotr Indyk


Regression and classification methods based on similarity of the input to stored examples have not been widely used in applications involving very large sets of high-dimensional data. Recent advances in computational geometry and machine learning, however, may alleviate the problems in using these methods on large data sets. This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic. It brings together contributions from researchers in theory of computation, machine learning, and computer vision with the goals of bridging the gaps between disciplines and presenting state-of-the-art methods for emerging applications.
The contributors focus on the importance of designing algorithms for NN search, and for the related classification, regression, and retrieval tasks, that remain efficient as the number of points or the dimensionality of the data grows very large. The book begins with theoretical chapters on computational geometry and then explores ways to make the NN approach practicable in machine learning applications where the dimensionality of the data and the size of the data sets make naïve methods for NN search prohibitively expensive. The final chapters describe successful applications of an NN algorithm, locality-sensitive hashing (LSH), to vision tasks.


Read or Download Nearest-Neighbor Methods in Learning and Vision PDF

Best machine theory books

Digital and Discrete Geometry: Theory and Algorithms

This book provides comprehensive coverage of modern methods for geometric problems in the computing sciences. It also covers current topics in data science, including geometric processing, manifold learning, Google search, cloud data, and R-trees for wireless networks and big data. The author investigates digital geometry and its related constructive methods in discrete geometry, presenting detailed methods and algorithms.

Artificial Intelligence and Symbolic Computation: 12th International Conference, AISC 2014, Seville, Spain, December 11-13, 2014. Proceedings

This book constitutes the refereed proceedings of the 12th International Conference on Artificial Intelligence and Symbolic Computation, AISC 2014, held in Seville, Spain, in December 2014. The 15 full papers presented together with 2 invited papers were carefully reviewed and selected from 22 submissions.

Statistical Language and Speech Processing: Third International Conference, SLSP 2015, Budapest, Hungary, November 24-26, 2015, Proceedings

This book constitutes the refereed proceedings of the Third International Conference on Statistical Language and Speech Processing, SLSP 2015, held in Budapest, Hungary, in November 2015. The 26 full papers presented together with invited talks were carefully reviewed and selected from 71 submissions.

Extra resources for Nearest-Neighbor Methods in Learning and Vision

Sample text

Submetrics. Plainly, any (U′, D′), where U′ ⊂ U and D′ is D restricted to U′ × U′, is a metric space.

Products. Let Û be the cross-product U₁ × U₂ × … × U_d, that is, the d-tuples over the U_i. For some value p with 1 ≤ p ≤ ∞, define D̂ as follows: for x, y ∈ Û, let

    D̂(x, y) := ( Σ_i D_i(x_i, y_i)^p )^(1/p),

the product metric. When all U_i = ℝ and D_i(x, y) = |x − y|, this yields ℝ^d with the ℓ_p distance measures, so D̂(x, y) = ‖x − y‖_p. When p = d = 2, this is simply the Euclidean plane.
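The product-metric construction above can be sketched directly in code. This is a minimal illustration, not from the book; the function name product_metric is ours, and each D_i is passed in as a plain Python callable.

```python
import math

def product_metric(Ds, x, y, p=2.0):
    """Combine per-coordinate metrics Ds[i] on the spaces U_i into the
    product metric D_hat(x, y) = (sum_i Ds[i](x[i], y[i])**p) ** (1/p).
    For p = infinity this becomes the max over coordinates."""
    if math.isinf(p):
        return max(D(xi, yi) for D, xi, yi in zip(Ds, x, y))
    return sum(D(xi, yi) ** p for D, xi, yi in zip(Ds, x, y)) ** (1.0 / p)

# With all U_i the reals and D_i(a, b) = |a - b|, p = d = 2 recovers
# the Euclidean plane:
abs_diff = lambda a, b: abs(a - b)
d = product_metric([abs_diff, abs_diff], (0.0, 0.0), (3.0, 4.0), p=2.0)
# d == 5.0, the Euclidean distance between (0, 0) and (3, 4)
```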

2. Reverse Queries. Another related problem is that of building a data structure for reverse or inverse nearest neighbor queries, where the input is similar, but the answer is not the site nearest to query point q, but rather the sites that have q as their (second) nearest neighbor in S ∪ {q}; that is, the answer is the set of sites {s ∈ S | D(s, q) ≤ D(s, S \ {s})}. As with (forward) nearest-neighbor searching, this problem also can be generalized with k and ε: given k and site q, find all sites that have q as kth nearest neighbor, or given ε > 0, find all sites such that q is within a factor 1 + ε of closest.
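The set definition above translates directly into a brute-force check; a sketch (ours, O(n²), with no data structure, purely to pin down the definition):

```python
def reverse_nearest_neighbors(S, q, D):
    """Return the sites s in S that have q as their nearest neighbor in
    S ∪ {q}, i.e. {s in S | D(s, q) <= D(s, S \\ {s})}."""
    result = []
    for i, s in enumerate(S):
        # Distance from s to its nearest neighbor among the other sites.
        d_rest = min(D(s, t) for j, t in enumerate(S) if j != i)
        if D(s, q) <= d_rest:
            result.append(s)
    return result

# One-dimensional example with D(a, b) = |a - b|:
D = lambda a, b: abs(a - b)
# For q = 9.0, only the site 10.0 is closer to q than to any other site.
# reverse_nearest_neighbors([0.0, 1.0, 10.0], 9.0, D) -> [10.0]
```

Note that a site can be the answer's only member, or the answer can be empty; unlike forward NN search, the result is a set whose size varies with q.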

The p ∈ P that minimizes D(p, q) is of course the nearest in P to q, while the p ∈ P that minimizes D(p, s) is the nearest in P to s. So if some p ∈ P is close to q and far from s, or close to s and far from q, then it gives a proof that s cannot be close to q. These considerations suggest that one major piece of information for a query point q is the closest site in P . Next we will consider how such information can be used, together with the doubling constant and doubling measure conditions, to suggest some data structures for nearest-neighbor searching that have provable properties.
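The pruning idea in the paragraph above — a pivot p that is close to one of q, s and far from the other certifies that s is far from q — follows from the triangle inequality: D(q, s) ≥ |D(p, q) − D(p, s)|. A minimal sketch under our own naming (a brute-force scan with pivot-based skipping, not one of the provable data structures the chapter goes on to develop):

```python
def nn_search_with_pivots(S, q, D, pivots):
    """Linear-scan nearest-neighbor search over sites S that uses
    precomputed pivot distances to skip sites s whose lower bound
    max_p |D(p, q) - D(p, s)| already exceeds the best distance found."""
    pivot_to_site = [[D(p, s) for s in S] for p in pivots]
    pivot_to_q = [D(p, q) for p in pivots]
    best, best_d = None, float("inf")
    for i, s in enumerate(S):
        # Triangle-inequality lower bound on D(q, s) from every pivot.
        lower = max(abs(pq - ps[i]) for pq, ps in zip(pivot_to_q, pivot_to_site))
        if lower >= best_d:
            continue  # s provably cannot beat the current best; skip D(q, s)
        d = D(q, s)
        if d < best_d:
            best, best_d = s, d
    return best, best_d

# One-dimensional example: with pivot 0.0 and query 4.0, the site 9.0
# is pruned without ever computing D(q, 9.0).
D = lambda a, b: abs(a - b)
# nn_search_with_pivots([0.0, 2.0, 5.0, 9.0], 4.0, D, pivots=[0.0]) -> (5.0, 1.0)
```

The pivot distances D(p, s) can be computed once and reused across queries, which is exactly the sense in which "the closest site in P" is useful stored information.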

