Kifua AI: Why Chest X-ray Triage Matters in Low-Radiologist Settings

    By Fred Mutisya · November 26, 2025
    Radiology · Kifua AI · AI · X-rays

    It’s Monday at a county hospital. By noon, the X-ray room has already seen 40 patients: children with severe pneumonia, adults with chronic cough, elderly patients with heart failure.

    There’s one radiologist covering the entire hospital—and they’re also responsible for CT scans, ultrasound, and teaching interns. Many chest X-rays are first interpreted by clinical officers or medical officers with limited radiology training.

    In this environment, delays and missed findings are almost inevitable.

    The Radiology Gap

    Across many Kenyan and African hospitals, we see the same pattern:

    • High volume of chest X-rays (CXRs).
    • Limited number of radiologists—sometimes none on-site.
    • Images piling up before a specialist can review them.

    When that happens:

    • A subtle but important opacity might be missed in a child with recurrent pneumonia.
    • A large pleural effusion might not be recognised as urgent.
    • A patient with TB-compatible findings might wait longer to be investigated and isolated.

    These are not failures of individual clinicians; they’re symptoms of a system that’s stretched far beyond its capacity.

    Why Triage, Not “AI Radiologist”

    When people hear “AI for X-rays,” they often imagine a machine replacing a radiologist.

    That’s not the goal of Kifua AI.

    Instead, we ask a more practical question:

    “Given 100 chest X-rays today, which ones should we look at first?”

    This is what we mean by triage:

    • Prioritising images that are more likely to show serious abnormalities.
    • Flagging cases that might need urgent review or repeat imaging.
    • Giving non-radiologist clinicians a second set of eyes while they wait for specialist input.

    Kifua AI aims to help answer:

    • “Is this likely normal?”
    • “Is there something worrying here?”
    • “Should this X-ray be bumped up the queue?”

    How Kifua AI Fits into Workflow

    Imagine this workflow in a district hospital:

    1. The patient gets a chest X-ray as usual.
    2. The image is automatically sent to a local server running Kifua AI.
    3. Within seconds, Kifua AI:
    • Assigns a triage label: “High priority abnormal,” “Abnormal – routine review,” or “Likely normal.”
    • Highlights areas of interest on the X-ray (e.g. right lower zone consolidation).
    4. The clinician sees:
    • The original image.
    • The AI’s triage label and a short summary.
    • A heatmap overlay showing where the AI “looked.”
    The radiologist still makes the final call. But instead of going through 100 CXRs in arbitrary order, they can start with the 20 that Kifua AI thinks are most concerning.
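    The triage logic described above can be sketched in a few lines. This is an illustrative sketch only, not Kifua AI's actual implementation: the `Study` class, the finding names, and the probability thresholds (0.8 and 0.2) are hypothetical placeholders for whatever the deployed model and clinical team settle on.

```python
from dataclasses import dataclass

# Hypothetical: findings treated as urgent for queue-jumping purposes.
URGENT_FINDINGS = {"consolidation", "pleural_effusion", "cavitation"}

@dataclass
class Study:
    study_id: str
    probs: dict  # per-finding probabilities from the model, e.g. {"consolidation": 0.91}

def triage_label(study: Study, high: float = 0.8, low: float = 0.2) -> str:
    """Map per-finding probabilities to one of three queue labels."""
    if any(p >= high for f, p in study.probs.items() if f in URGENT_FINDINGS):
        return "High priority abnormal"
    if any(p >= low for p in study.probs.values()):
        return "Abnormal - routine review"
    return "Likely normal"

def prioritised_queue(studies: list) -> list:
    """Sort the day's studies so the most concerning are read first."""
    order = {"High priority abnormal": 0,
             "Abnormal - routine review": 1,
             "Likely normal": 2}
    return sorted(studies,
                  key=lambda s: (order[triage_label(s)],
                                 -max(s.probs.values())))
```

    The point of the sketch is the reordering at the end: the radiologist's reading list is the same 100 studies, just sorted so the likely-urgent ones come first.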

    The Diseases We Care About

    Kifua AI focuses on conditions where early detection on CXR can change management:

    • Pneumonia – especially severe consolidation.
    • Tuberculosis – suspicious upper lobe infiltrates / cavitary lesions.
    • Heart failure and cardiomegaly.
    • Pleural effusion.
    • Other obvious abnormalities that warrant urgent review.

    Of course, AI cannot see everything, and it can be wrong. But as a triage assistant, even modest improvements in prioritisation could translate into:

    • Faster treatment for the sickest patients.
    • Less cognitive load on overworked clinicians.
    • More consistent detection of serious findings.

    Kifua AI as a Complement, Not a Competitor

    Our core philosophy mirrors that of Afya-Yangu AI:

    Kifua AI is a tool to support human clinicians, not to replace them.

    By focusing on triage:

    • We keep the human expert in the loop.
    • We respect the complexity of radiology as a discipline.
    • We target a realistic, high-impact problem: who gets seen first.

    In upcoming posts, we’ll explore how Kifua AI is trained using the CheXpert dataset, why we use an ensemble of models, and how we make its decisions more interpretable with heatmaps and saliency maps.