From Surf Wiki (app.surf) — the open knowledge base
Image analogy
An image analogy is a method of creating an image filter automatically from training data. The transformation between a pair of training images A and A', where A' is a filtered version of A, is "learned". Later, given a different image B, its "analogy" image B' — an image that relates to B as A' relates to A — can be synthesized by applying the learned transformation.
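The A : A' :: B : B' mapping can be sketched with a brute-force nearest-neighbour search: for each pixel of B, find the pixel of A whose local neighbourhood looks most similar, and copy the corresponding pixel of A' into B'. This is only a minimal illustration of the core matching step; the published method additionally uses a multiscale Gaussian pyramid and a coherence search, which are omitted here. The helper names and the toy demo below are illustrative, not part of any standard API.

```python
# Minimal sketch of the image-analogy idea. Assumptions: grayscale
# images as 2-D NumPy arrays, 3x3 neighbourhood features, plain L2
# nearest-neighbour matching (no pyramid, no coherence term).
import numpy as np

def patches(img, r=1):
    """Flattened (2r+1)x(2r+1) neighbourhood of every pixel, edge-padded."""
    p = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.empty((h, w, (2 * r + 1) ** 2))
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + 2 * r + 1, j:j + 2 * r + 1].ravel()
    return out.reshape(h * w, -1)

def image_analogy(A, Ap, B, r=1):
    """Given the training pair (A, A'), synthesize B' for a new image B."""
    feat_A = patches(A, r)            # neighbourhood features of A
    feat_B = patches(B, r)            # neighbourhood features of B
    Bp = np.empty_like(B)
    h, w = B.shape
    for idx, f in enumerate(feat_B):
        # best-matching neighbourhood in A (brute-force L2 search)
        best = np.argmin(((feat_A - f) ** 2).sum(axis=1))
        # copy the pixel of A' at the matched location into B'
        Bp[idx // w, idx % w] = Ap.flat[best]
    return Bp

# Toy demo: the "filter" inverts intensities (A' = 15 - A), so the
# analogy should invert the new image B as well.
A = np.arange(16, dtype=float).reshape(4, 4)
Ap = 15.0 - A                         # learned-from pair: inversion
B = A[1:, 1:].copy()                  # a different (here, cropped) image
Bp = image_analogy(A, Ap, B)
```

Because matching is per-pixel and brute-force, this sketch is O(|A| · |B|) and only suited to tiny images; the multiscale, approximate-nearest-neighbour machinery of the real method exists precisely to make this search tractable.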
The image analogy method has been used to simulate many types of image filters:
- Toy filters, such as blurring or "embossing";
- Texture synthesis from an example texture;
- Super-resolution, inferring a high-resolution image from a low-resolution source;
- Texture transfer, in which images are "texturized" with some arbitrary source texture;
- Artistic filters, in which various drawing and painting styles, including oil, pastel, and pen-and-ink rendering, are synthesized based on scanned real-world examples;
- Texture-by-numbers, in which realistic scenes, composed of a variety of textures, are created using a simple "painting" interface;
- Image colorization, where color is automatically added to grayscale images.
This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.