<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.bwhpc.de/wiki/index.php?action=history&amp;feed=atom&amp;title=Helix%2FbwVisu%2FKI-Morph</id>
	<title>Helix/bwVisu/KI-Morph - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.bwhpc.de/wiki/index.php?action=history&amp;feed=atom&amp;title=Helix%2FbwVisu%2FKI-Morph"/>
	<link rel="alternate" type="text/html" href="https://wiki.bwhpc.de/wiki/index.php?title=Helix/bwVisu/KI-Morph&amp;action=history"/>
	<updated>2026-04-24T01:14:05Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.39.17</generator>
	<entry>
		<id>https://wiki.bwhpc.de/wiki/index.php?title=Helix/bwVisu/KI-Morph&amp;diff=13419&amp;oldid=prev</id>
		<title>H Schumacher: created page</title>
		<link rel="alternate" type="text/html" href="https://wiki.bwhpc.de/wiki/index.php?title=Helix/bwVisu/KI-Morph&amp;diff=13419&amp;oldid=prev"/>
		<updated>2024-12-04T14:27:25Z</updated>

		<summary type="html">&lt;p&gt;created page&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{|style=&amp;quot;background:#FEF4AB; width:100%;&amp;quot;&lt;br /&gt;
|style=&amp;quot;padding:5px; background:#FEF4AB; text-align:left&amp;quot;|&lt;br /&gt;
This page is work in progress.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
[https://emcl.iwr.uni-heidelberg.de/research/projects/ki-morph KI-Morph] is a research project that addresses a major bottleneck in biological research. When scientists use X-ray tomography to create 3D images of biological specimens, such as parts of cells or whole organs, they generate very large volumes of data. Although acquiring these 3D images is relatively quick, analyzing them is slow because the analysis is usually done by hand.&lt;br /&gt;
&lt;br /&gt;
== Motivation ==&lt;br /&gt;
&lt;br /&gt;
=== Automating Analysis ===&lt;br /&gt;
&lt;br /&gt;
KI-Morph is developing a system that automatically processes and analyzes these large image data sets. Instead of researchers spending hours, days, or even weeks inspecting and interpreting the images, the system would handle much of this work, speeding up the whole analysis.&lt;br /&gt;
&lt;br /&gt;
=== Handling Huge Amounts of Data ===&lt;br /&gt;
&lt;br /&gt;
The project focuses on creating a framework that can handle extremely large amounts of data, up to the petabyte scale. To put this in perspective, one petabyte is about 1,000 terabytes, or 1,000,000 gigabytes. Handling this much data efficiently is a significant challenge.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
&lt;br /&gt;
* [https://emcl.iwr.uni-heidelberg.de/people/zeilmann-alexander Alexander Zeilmann]&lt;/div&gt;</summary>
		<author><name>H Schumacher</name></author>
	</entry>
</feed>