Update content/article.md
content/article.md (+1 -1)

@@ -264,7 +264,7 @@ So what do we see? Llama is a basis for many models, and it shows.
 Radically different architectures such as mamba have spawned their own dependency subgraph.
 [code relatedness](d3_dependency_graph.html)
 
-
+![[graph_modular_related_models.png]]
 
 But there is no similar miracle for VLMs across the board.
 As you can see, there is a small DETR island, a little llava pocket, and so on, but it's not comparable to the centrality observed.
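As a rough illustration of the structure the article describes, here is a minimal networkx sketch of the kind of analysis behind the "hub" and "island" claims: degree centrality surfaces heavily derived-from base models (the Llama effect), and weakly connected components surface isolated subgraphs (the Mamba, DETR, and llava pockets). The edge list and model names below are hypothetical stand-ins, not the article's actual dependency data.

```python
import networkx as nx

# Hypothetical (derived_model, base_model) edges standing in for the
# article's model-dependency data; the names are illustrative only.
edges = [
    ("alpaca", "llama"),
    ("vicuna", "llama"),
    ("llava", "vicuna"),
    ("mamba-2", "mamba"),
    ("jamba", "mamba"),
    ("deformable-detr", "detr"),
]

G = nx.DiGraph(edges)

# Hub models: high degree centrality marks bases that many models build on.
centrality = nx.degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1])[:3])

# Islands: each weakly connected component is one dependency subgraph.
for island in nx.weakly_connected_components(G):
    print(island)
```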