"""Modal app exposing a deep_log_analysis web endpoint.

This is called by the Modal MCP server (mcp_servers/modal_server.py).
It expects a JSON body of the form:

{
  "service": "recs-api",
  "env": "prod",
  "logs": [
    {"timestamp": "...", "service": "...", "env": "...",
     "severity": "ERROR", "message": "...", "region": "..."},
    ...
  ]
}

and returns a JSON object with some aggregate stats and a short summary.
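
Example response (field values are illustrative):

{
  "service": "recs-api",
  "env": "prod",
  "log_count": 42,
  "severity_counts": {"ERROR": 3, "INFO": 39},
  "top_region": "us-east-1",
  "top_region_count": 20,
  "latest_error": {"timestamp": "...", "severity": "ERROR", "message": "...", "region": "..."},
  "summary": "Deep analysis for service 'recs-api' in env 'prod' over 42 log entries. ..."
}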
"""

from __future__ import annotations

from collections import Counter
from typing import Any, Dict, List

import modal


# Web endpoints using modal.fastapi_endpoint now require FastAPI to be installed
# explicitly in the container image.
image = modal.Image.debian_slim().pip_install("fastapi[standard]")

app = modal.App("incident-deep-analysis", image=image)


@app.function()
@modal.fastapi_endpoint(method="POST", docs=True)
def deep_log_analysis(payload: Dict[str, Any]) -> Dict[str, Any]:
    service = payload.get("service")
    env = payload.get("env")
    logs: List[Dict[str, Any]] = payload.get("logs") or []

    # Basic stats over the logs we received
    severity_counts: Counter[str] = Counter()
    regions: Counter[str] = Counter()
    latest_error: Dict[str, Any] | None = None

    for entry in logs:
        sev = str(entry.get("severity", "UNKNOWN"))
        severity_counts[sev] += 1
        region = str(entry.get("region", "unknown"))
        regions[region] += 1

        if sev in {"ERROR", "CRITICAL"}:
            # keep the last error we see (logs are usually newest-first)
            latest_error = entry

    top_region, top_region_count = (None, 0)
    if regions:
        top_region, top_region_count = regions.most_common(1)[0]

    summary_lines = []
    summary_lines.append(
        f"Deep analysis for service '{service}' in env '{env}' over {len(logs)} log entries."
    )

    if severity_counts:
        parts = [f"{sev}={count}" for sev, count in severity_counts.items()]
        summary_lines.append("Severity distribution: " + ", ".join(parts) + ".")

    if latest_error is not None:
        summary_lines.append(
            "Latest high-severity event: "
            f"[{latest_error.get('severity')}] {latest_error.get('message')} "
            f"at {latest_error.get('timestamp')} (region={latest_error.get('region')})."
        )

    if top_region is not None:
        summary_lines.append(
            f"Region with most activity: {top_region} ({top_region_count} events)."
        )

    summary = " ".join(summary_lines)

    return {
        "service": service,
        "env": env,
        "log_count": len(logs),
        "severity_counts": dict(severity_counts),
        "top_region": top_region,
        "top_region_count": top_region_count,
        "latest_error": latest_error,
        "summary": summary,
    }
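

# Example client call after deploying with `modal deploy` (a minimal sketch;
# the URL below is a placeholder, Modal prints the actual endpoint URL when
# the app is deployed):
#
#   import requests
#
#   resp = requests.post(
#       "https://<workspace>--incident-deep-analysis-deep-log-analysis.modal.run",
#       json={
#           "service": "recs-api",
#           "env": "prod",
#           "logs": [
#               {"timestamp": "2024-05-01T12:00:00Z", "service": "recs-api",
#                "env": "prod", "severity": "ERROR",
#                "message": "upstream timeout", "region": "us-east-1"},
#           ],
#       },
#   )
#   print(resp.json()["summary"])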