Quine Logs
Full Recipe
Shared by: Michael Aglietti
This recipe processes Quine log lines using a regular expression.
Quine Logs Recipe (full 168-line recipe file)
Scenario
In this scenario, we process Quine log output to manifest a graph that aids in troubleshooting.
Sample Data
Sample data is created using Quine itself by launching Quine with the thatdot.loglevel Java system property to control the logging level.
❯ java -Dthatdot.loglevel=DEBUG -jar quine-1.7.3.jar > quine.log
Graph is ready
Quine web server available at http://localhost:8080
Verify that Quine is running with the /api/v1/admin/readiness
API endpoint.
❯ http GET http://localhost:8080/api/v1/admin/readiness
HTTP/1.1 204 No Content
Shut down Quine using the /api/v1/admin/shutdown
API endpoint.
❯ http POST http://localhost:8080/api/v1/admin/shutdown
HTTP/1.1 202 Accepted
You now have a file containing a series of INFO events produced during Quine startup.
2023-02-10 10:16:02,410 INFO [NotFromActor] [main] com.thatdot.quine.app.Main$ - Running 1.5.1 with 10 available cores and 12GiB max heap size.
2023-02-10 10:16:02,890 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.persistor.ExceptionWrappingPersistenceAgent - Persistence backend for: core quine data is at: Version(13.0.0), this is usable as-is by: Version(13.0.0)
2023-02-10 10:16:02,970 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.graph.GraphService - Adding a new local shard at idx: 0
2023-02-10 10:16:02,972 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.graph.GraphService - Adding a new local shard at idx: 1
2023-02-10 10:16:02,972 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.graph.GraphService - Adding a new local shard at idx: 2
2023-02-10 10:16:02,972 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.graph.GraphService - Adding a new local shard at idx: 3
2023-02-10 10:16:02,980 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-14] com.thatdot.quine.persistor.ExceptionWrappingPersistenceAgent - Persistence backend for: Quine app state is at: Version(1.1.0), this is usable as-is by: Version(1.1.0)
2023-02-10 10:16:19,364 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-7] com.thatdot.quine.persistor.ExceptionWrappingPersistenceAgent - Persistence backend for: core quine data is at: Version(13.0.0), this is usable as-is by: Version(13.0.0)
How it Works
The recipe reads Quine log events from a file using an ingest stream to manifest a graph in Quine. The filename is passed to Quine at runtime using `--recipe-value in_file=quine.log`.
INGEST-1 processes the quine.log
file:
- type: FileIngest
  path: $in_file
  format:
    type: CypherLine
    query: |-
      WITH text.regexFirstMatch($that, "(^\\d{4}-\\d{2}-\\d{2} \\d{1,2}:\\d{2}:\\d{2},\\d{3}) (FATAL|ERROR|WARN|INFO|DEBUG) \\[(\\S*)\\] \\[(\\S*)\\] (\\S*) - (.*)") AS r WHERE r IS NOT NULL
      WITH r, split(r[3], "/") as path,
           split(r[6], "(") as msgPts
      WITH r, path, msgPts, replace(COALESCE(split(path[2], "@")[-1], 'No host'),")","") as qh

      MATCH (actor), (msg), (class), (host)
      WHERE id(host) = idFrom("host", qh)
        AND id(actor) = idFrom("actor", r[3])
        AND id(msg) = idFrom("msg", r[0])
        AND id(class) = idFrom("class", r[5])

      SET host.address = split(qh, ":")[0],
          host.port = split(qh, ":")[-1],
          host.host = qh,
          host: Host

      SET actor.address = r[3],
          actor.id = replace(path[-1],")",""),
          actor.shard = path[-2],
          actor.type = path[-3],
          actor: Actor

      SET msg.msg = r[6],
          msg.path = path[0],
          msg.type = split(msgPts[0], " ")[0],
          msg.level = r[2],
          msg: Message

      SET class.class = r[5],
          class: Class

      WITH * CALL reify.time(datetime({date: localdatetime(r[1], "yyyy-MM-dd HH:mm:ss,SSS")})) YIELD node AS time

      CREATE (host)<-[:ON_HOST]-(actor)-[:SENT]->(msg),
             (actor)-[:OF_CLASS]->(class),
             (msg)-[:AT_TIME]->(time)
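The MATCH ... WHERE id(node) = idFrom(...) pattern locates each node at an id computed deterministically from the supplied values, so every log line that refers to the same host, actor, message, or class resolves to the same node instead of creating duplicates. The following is a minimal sketch of the idea that can be run on its own; the host value is made up and not taken from the sample log.

// Resolve the node whose id is derived from the ("host", <value>) pair.
// idFrom is deterministic, so every ingested line containing this same
// host value addresses this one node. ("example-host:25520" is hypothetical.)
MATCH (host)
WHERE id(host) = idFrom("host", "example-host:25520")
RETURN id(host)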
The regular expression parses each log line into its parts.
(^\d{4}-\d{2}-\d{2} \d{1,2}:\d{2}:\d{2},\d{3}) (FATAL|ERROR|WARN|INFO|DEBUG) \[(\S*)\] \[(\S*)\] (\S*) - (.*)
0: whole matched line
1: date time string
2: log level
3: actor address; may appear inside `akka.stream.Log(…)`
4: thread name
5: logging class
6: message
You can explore the regular expression in detail via the pattern saved on regex101.
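To see what the capture groups produce for a concrete line, you can run the same text.regexFirstMatch call on its own against one of the sample lines above. This is a standalone sketch for inspection, not part of the recipe.

// Apply the recipe's regular expression to one sample log line and
// return the capture groups under the names used in this recipe.
WITH "2023-02-10 10:16:02,970 INFO [NotFromActor] [graph-service-akka.actor.default-dispatcher-5] com.thatdot.quine.graph.GraphService - Adding a new local shard at idx: 0" AS line
WITH text.regexFirstMatch(line, "(^\\d{4}-\\d{2}-\\d{2} \\d{1,2}:\\d{2}:\\d{2},\\d{3}) (FATAL|ERROR|WARN|INFO|DEBUG) \\[(\\S*)\\] \\[(\\S*)\\] (\\S*) - (.*)") AS r
// r[0] is the whole matched line; r[1] through r[6] are the groups listed above
RETURN r[1] AS timestamp, r[2] AS level, r[3] AS actor, r[4] AS thread, r[5] AS class, r[6] AS message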
Running the Recipe
❯ java -jar quine-1.7.3.jar -r quine-logs-recipe.yaml --recipe-value in_file=quine.log
Graph is ready
Running Recipe: Quine Log Reader
Using 4 node appearances
Using 8 quick queries
Using 2 sample queries
Running Ingest Stream INGEST-1
Quine web server available at http://localhost:8080
INGEST-1 status is completed and ingested 8
Tip
We've included a series of Quick Queries to help explore the graph. They are available by right-clicking on a node displayed in the Exploration UI.
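The graph can also be explored with ad hoc Cypher queries. For example, a query along these lines (a hand-written sketch, not one of the recipe's built-in Quick Queries or sample queries) lists the messages each actor logged:

// Show up to ten Actor -> Message pairs manifested by INGEST-1.
MATCH (actor:Actor)-[:SENT]->(msg:Message)
WHERE msg.level = "INFO"
RETURN actor.address, msg.msg
LIMIT 10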