Gauravadlakha1509 committed
Commit a5de9f5 · verified · 1 Parent(s): e66ab67

deploy at 2025-04-10 18:37:17.187360

.dockerignore ADDED
@@ -0,0 +1,8 @@
+ .venv/
+ __pycache__/
+ *.pyc
+ .git/
+ .gitignore
+ .sesskey
+ uploads/*
+ !uploads/.gitkeep
.gitignore ADDED
@@ -0,0 +1,7 @@
+ .venv/
+ __pycache__/
+ *.pyc
+ .sesskey
+ uploads/*
+ !uploads/.gitkeep
+ *.pkl
Dockerfile ADDED
@@ -0,0 +1,10 @@
+ FROM python:3.10
+ WORKDIR /code
+ COPY --link --chown=1000 . .
+ RUN mkdir -p /tmp/cache/
+ RUN chmod a+rwx -R /tmp/cache/
+ ENV HF_HUB_CACHE=HF_HOME
+ RUN pip install --no-cache-dir -r requirements.txt
+
+ ENV PYTHONUNBUFFERED=1 PORT=7860
+ CMD ["python", "main.py"]
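
Two details in this Dockerfile look unintended: `ENV HF_HUB_CACHE=HF_HOME` sets the cache variable to the literal string `HF_HOME` rather than a path, and `CMD ["python", "main.py"]` runs a file this commit does not add (the application is in `app.py`). A minimal sketch of the likely intent, assuming the `/tmp/cache/` directory created above is meant to hold the Hugging Face cache and `app.py` is the entry point:

```dockerfile
FROM python:3.10
WORKDIR /code
COPY --link --chown=1000 . .
RUN pip install --no-cache-dir -r requirements.txt

# Assumption: the writable /tmp/cache/ dir is intended as the HF cache location.
RUN mkdir -p /tmp/cache/ && chmod a+rwx -R /tmp/cache/
ENV HF_HOME=/tmp/cache/ HF_HUB_CACHE=/tmp/cache/

ENV PYTHONUNBUFFERED=1 PORT=7860
CMD ["python", "app.py"]
```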
README.md CHANGED
@@ -1,10 +1,42 @@
  ---
- title: Img Cls1
- emoji: 🐨
- colorFrom: gray
- colorTo: blue
+ title: FastHTML Image classifier
+ emoji: 🚀
+ colorFrom: blue
+ colorTo: indigo
  sdk: docker
+ sdk_version: "3.11"
+ app_file: app.py
  pinned: false
  ---

  Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+
+ # FastHTML Hello World
+
+ A simple Hello World web application using FastHTML and MonsterUI for Hugging Face Spaces.
+
+ ## Features
+
+ - Clean, modern UI with MonsterUI styling
+ - Simple interactive button using HTMX
+ - Ready to deploy to Hugging Face Spaces
+
+ ## Deployment
+
+ To deploy this application to Hugging Face Spaces:
+
+ 1. Create a free account on [Hugging Face](https://huggingface.co)
+ 2. Go to your account settings and create an access token with write access
+ 3. Set the `HF_TOKEN` environment variable to that token
+ 4. Run `fh_hf_deploy <space_name>` replacing `<space_name>` with your desired space name
+
+ ## Local Development
+
+ To run this application locally:
+
+ ```bash
+ pip install -r requirements.txt
+ python app.py
+ ```
+
+ Then open your browser to [http://localhost:5001](http://localhost:5001)
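
The Deployment steps in the README above translate directly into shell commands; a sketch with placeholder values (the token and space name are yours to supply, not defaults):

```bash
pip install fasthtml-hf                 # provides the fh_hf_deploy command
export HF_TOKEN=hf_xxxxxxxxxxxxxxxx     # write-access token from your HF settings (placeholder)
fh_hf_deploy my-image-classifier        # replace with your desired space name
```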
app.py ADDED
@@ -0,0 +1,57 @@
+ from fasthtml_hf import setup_hf_backup
+ from fasthtml import FastHTML
+ from monsterui.core import Theme
+ from fasthtml.common import *
+ import os, uvicorn
+ from starlette.responses import FileResponse
+ from starlette.datastructures import UploadFile
+ from fastai.vision.all import *
+
+
+ theme = Theme.blue
+ app, rt = fast_app(hdrs=theme.headers())
+
+ os.makedirs("uploads", exist_ok=True)
+
+ def classify(image_path):
+     im = PILImage.create(image_path)
+     learn = load_learner("model.pkl")
+     cls,idx,probs = learn.predict(im)
+     return cls,probs[idx]
+
+
+ @app.get("/")
+ def home():
+     return Title("German Bread Classification"), Main(
+         H1("German Bread Classification App"),
+         Form(
+             Input(type="file", name="image", accept="image/*", required=True),
+             Button("Classify"),
+             enctype="multipart/form-data",
+             hx_post="/classify",
+             hx_target="#result"
+         ),
+         Br(), Div(id="result"),
+         cls="container"
+     )
+
+ @app.post("/classify")
+ async def handle_classify(image:UploadFile):
+
+     image_path = f"uploads/{image.filename}"
+     with open(image_path, "wb") as f:
+         f.write(await image.read())
+
+     result = classify(image_path)
+
+     return Div(
+         P(f"Classification result: {result}"),
+         Img(src=f"/uploads/{image.filename}", alt="Uploaded image", style="max-width: 300px;")
+     )
+
+ @app.get("/uploads/{filename}")
+ async def serve_upload(filename: str):
+     return FileResponse(f"uploads/{filename}")
+
+ setup_hf_backup(app)
+ serve()
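
`classify()` above calls `load_learner("model.pkl")` on every request, so the exported model is deserialized for each upload. A small sketch (not part of this commit) of caching the `Learner` so it is loaded once and reused, assuming the same `model.pkl` path:

```python
from functools import lru_cache
from fastai.vision.all import load_learner, PILImage

@lru_cache(maxsize=1)
def get_learner():
    # Deserialize the exported model only on the first call; later calls reuse it.
    return load_learner("model.pkl")

def classify(image_path):
    im = PILImage.create(image_path)
    cls, idx, probs = get_learner().predict(im)
    return cls, probs[idx]
```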
config.ini ADDED
@@ -0,0 +1,5 @@
+ [DEFAULT]
+ dataset_id = space-backup
+ db_dir = data
+ private_backup = True
+ interval = 15
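
For reference, a `[DEFAULT]`-only file like this can be read with Python's standard `configparser`; the key names below come from the file above, while the reading code is only illustrative (it is not how `fasthtml-hf` itself is implemented):

```python
import configparser

cfg = configparser.ConfigParser()
cfg.read("config.ini")

d = cfg["DEFAULT"]                         # the [DEFAULT] section shown above
dataset_id = d.get("dataset_id")           # "space-backup"
db_dir = d.get("db_dir")                   # "data"
private = d.getboolean("private_backup")   # True
interval = d.getint("interval")            # 15 (minutes between backups)
```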
fastai-style-guide.txt ADDED
@@ -0,0 +1,346 @@
+ fastai
+
+ ## fastai coding style
+
+ This is a brief discussion of fastai’s coding style, which is loosely informed by (a much diluted version of) the ideas developed over the last 60 continuous years of development in the APL / J / K programming communities, along with Jeremy’s personal experience contributing to programming language design and library development over the last 25 years. The style is particularly designed to be aligned with the needs of scientific programming and iterative, experimental development.
+
+ Everyone has strong opinions about coding style, except perhaps some very experienced coders, who have used many languages, who realize there’s lots of different perfectly acceptable approaches. The python community has particularly strongly held views, on the whole. I suspect this is related to Python being a language targeted at beginners, and therefore there are a lot of users with limited experience in other languages; however this is just a guess. Anyway, I don’t much mind what coding style you use when contributing to fastai, as long as:
+
+ * You don’t change existing code to reduce its compliance with this style guide (especially: don’t use an automatic linter / formatter!)
+ * You make some basic attempt to make your code not wildly different from the code that surrounds it.
+
+ Having said that, I do hope that you find the ideas in this style guide at least a little thought provoking, and that you consider adopting them to some extent when contributing to this library.
+
+ My personal approach to coding style is informed heavily by Iverson’s Turing Award (the “Nobel of computer science”) lecture of 1979, Notation as a Tool For Thought. If you can find the time, the paper is well worth reading and digesting carefully (it’s one of the most important papers in computer science history), representing the development of an idea that found its first expression in the release of APL in 1964. Iverson says:
+
+ > The thesis of the present paper is that the advantages of executability and universality found in programming languages can be effectively combined, in a single coherent language, with the advantages offered by mathematical notation
+
+ One key idea in the paper is that “_brevity facilitates reasoning_”, which has been incorporated into various guidelines such as “shorten lines of communication”. This is sometimes incorrectly assumed to just mean ‘terseness’, but it is a much deeper idea, as described in this Hacker News thread. I can’t hope to summarize this thinking here, but I can point out a couple of key benefits:
+
+ * It supports expository programming, particularly when combined with the use of Jupyter Notebook or a similar tool designed for experimentation
+ * The most productive programmers I’m aware of in the world, such as the extraordinary Arthur Whitney, often use this coding style (which may or may not be a coincidence!)
+
+ ## Style guide
+
+ Python has over time incorporated a number of ideas that make it more amenable to this form of programming, such as:
+
+ * List, dictionary, generator, and set comprehensions
+ * Lambda functions
+ * Python 3.6 interpolated format strings
+ * Numpy array-based programming.
+
+ Although Python will always be more verbose than many languages, by using these features liberally, along with some simple rules of thumb, we can aim to keep all the key ideas for one semantic concept in a single screen of code. This is one of my main goals when programming—I find it very hard to understand a concept if I have to jump around the place to put the bits together. (Or as Arthur Whitney says “I hate scrolling”!)
+
+ ### Symbol naming
+
+ * Follow standard Python casing guidelines (CamelCase classes, under_score for most everything else).
+ * In general, aim for what Perl designer Larry Wall describes metaphorically as _Huffman Coding_:
+
+ > In metaphorical honor of Huffman’s compression code that assigns smaller numbers of bits to more common bytes. In terms of syntax, it simply means that commonly used things should be shorter, but you shouldn’t waste short sequences on less common constructs.
+
+ * A fairly complete list of abbreviations is in abbr.qmd; if you see anything missing, feel free to edit this file.
+ * For example, in computer vision code, where we say ‘size’ and ‘image’ all the time, we use the shortened forms `sz` and `img`. Or in NLP code, we would say `lm` instead of ‘language model’
+ * Use `o` for an object in a comprehension, `i` for an index, and `k` and `v` for a key and value in a dictionary comprehension.
+ * Use `x` for a tensor input to an algorithm (e.g. layer, transform, etc), unless interoperating with a library where this isn’t the expected behavior (e.g. if writing a pytorch loss function, use `input` and `target` as is standard for that library).
+ * Take a look at the naming conventions in the part of code you’re working on, and try to stick with them. E.g. in `fastai.transforms` you’ll see ‘det’ for ‘deterministic’, ‘tfm’ for ‘transform’, and ‘coord’ for coordinate.
+ * Assume the coder has knowledge of the domain in which you’re working
+ * For instance, use `kl_divergence` not `kullback_leibler_divergence`; or (like pytorch) use `nll` not `negative_log_likelihood`. If the coder doesn’t know these terms, they will need to look them up in the docs anyway and learn the concepts; if they do know the terms, the abbreviations will be well understood
+ * When implementing a paper, aim to follow the paper’s nomenclature, unless it’s inconsistent with other widely-used conventions. E.g. `conv1` not `first_convolutional_layer`
+
+ Although it’s hard to design a really compelling experiment for this kind of thing, there is some interesting research supporting the idea that overly long symbol names negatively impact code comprehension.
+
+ ### Layout
+
+ * Code should be less wide than the number of characters that fill a standard modern small-ish screen (currently 1600x1200) at 14pt font size. That means around 160 characters. Following this rule will mean very few people will need to scroll sideways to see your code. (If they’re using a jupyter notebook theme that restricts their cell width, that’s on them to fix!)
+
+ * One line of code should implement one complete idea, where possible
+
+ * Generally therefore an `if` part and its 1-line statement should be on one line, using `:` to separate
+
+ * Using the ternary operator `x = y if a else b` can help with this guideline
+
+ * If a 1-line function body comfortably fits on the same line as the `def` section, feel free to put them together with `:`
+
+ * If you’ve got a bunch of 1-line functions doing similar things, they don’t need a blank line between them
+
+ [code]
+ def det_lighting(b, c): return lambda x: lighting(x, b, c)
+ def det_rotate(deg): return lambda x: rotate_cv(x, deg)
+ def det_zoom(zoom): return lambda x: zoom_cv(x, zoom)
+ [/code]
+
+ * Aim to align statement parts that are conceptually similar. It allows the reader to quickly see how they’re different. E.g. in this code it’s immediately clear that the two parts call the same code with different parameter orders.
+
+ [code]
+ if self.store.stretch_dir==0: x = stretch_cv(x, self.store.stretch, 0)
+ else:                         x = stretch_cv(x, 0, self.store.stretch)
+ [/code]
+
+ * Put all your class member initializers together using destructuring assignment. When doing so, use no spaces after the commas, but spaces around the equals sign, so that it’s obvious where the LHS and RHS are.
+
+ [code]
+ self.sz,self.denorm,self.norm,self.sz_y = sz,denorm,normalizer,sz_y
+ [/code]
+
+ * Avoid using vertical space when possible, since vertical space means you can’t see everything at a glance. For instance, prefer importing multiple modules on one line.
+
+ [code]
+ import PIL, os, numpy as np, math, collections, threading
+ [/code]
+
+ * Indent with 4 spaces. (In hindsight I wish I’d picked 2 spaces, like Google’s style guide, but I don’t feel like going back and changing everything…)
+
+ * When it comes to adding spaces around operators, try to follow notational conventions such that your code looks similar to domain specific notation. E.g. if using pathlib, don’t add spaces around `/` since that’s not how we write paths in a shell. In an equation, use spacing to lay out the separate parts of an equation so it’s as similar to regular math layout as you can.
+
+ * Avoid trailing whitespace
+
+ ### Algorithms
+
+ * fastai is designed to show off the best of what’s possible. So try to ensure that your implementation of an algorithm is at least as fast, accurate, and concise as other versions that exist (if they do), and use a profiler to check for hotspots and optimize them as appropriate (if the code takes more than a second to run in practice).
+ * Try to ensure that your algorithm scales nicely; specifically, it should work in 16GB RAM on arbitrarily large datasets. That will generally mean using lazy data structures such as generators, and not pulling everything in to a list.
+ * Add a comment that provides the equation number from the paper that you’re implementing in the appropriate part of the code.
+ * Use numpy/pytorch broadcasting, not loops, where possible.
+ * Use numpy/pytorch advanced indexing, not specialized indexing methods, where possible.
+ * Don’t submit a PR that implements the latest hot paper until you’ve actually tried using it on a few datasets, compared it to existing approaches, and confirmed it’s actually useful in practice! Ideally, include a notebook as a gist link with your PR showing these results.
+
+ ### Other stuff
+
+ * Feel free to assume the latest version of python and key libraries is installed. But do mention in the PR and docs if you’re relying on something that’s only a couple of months old (including recently fixed bugs). Don’t rely on any unreleased or beta versions however.
+ * Avoid comments unless they are necessary to tell the reader _why_ you’re doing something. To tell them _how_ you’re doing it, use symbol names and clear expository code.
+ * If you’re implementing a paper or following some other external document, include a link to it in your code.
+ * If you’re using nearly all the stuff provided by a module, just `import *`. There’s no need to list all the things you are importing separately! To avoid exporting things which are really meant for internal use, define `__all__`. (Not everything is currently following the `__all__` guideline, and we welcome PRs to fix this.)
+ * Assume the user has a modern editor or IDE and knows how to use it. E.g. if they want to browse the methods and classes, they can use code folding - they don’t need to rely on having two lines between classes. If they want to see the definition of a symbol they can jump to the reference/tag; they don’t need a list of imports at the top of the file. And so forth…
+ * Don’t use an automatic linter like autopep8 or formatter like yapf. No automatic tool can lay out your code with the care and domain understanding that you can. And it’ll break all the care and domain understanding that previous contributors have used in that file!
+ * Keep your PRs small, and for anything controversial or tricky discuss it on the forums first.
+
+ ### Documentation
+
+ * Documentation largely goes in the notebooks in `docs_src`, which is used to create the HTML docs
+ * In the code, add a one-line docstring which includes back-quoted references to the main params by name
+ * The Python re module is a good role model for the documentation style we’re looking for.
+
+ ## FAQ
+
+ Why not use PEP 8?
+
+ I don’t think it’s ideal for the style of programming that we use, or for math-heavy code. If you’ve never used anything except PEP 8, here’s a chance to experiment and learn something new!
+
+ My editor is complaining about PEP 8 violations in fastai; what should I do?
+
+ Pretty much all editors have the ability to disable linting for a project; figure out how to do that in your editor.
+
+ Are you worried that using a different style guide might put off new contributors?
+
+ Not really. We’re really not that fussy about style, so we won’t be rejecting PRs that aren’t formatted according to this document. And whilst there are people around who are so closed-minded that they can’t handle new things, they’re certainly not the kind of people we want to be working with!
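
As a quick illustration of a few rules from the guide above (short Huffman-coded names, destructuring assignment for member initializers, one-line `def`s for similar helpers with no blank lines between them), here is a small made-up snippet; it is not fastai code, just a sketch written in that style:

```python
import numpy as np

class Cropper:
    "Crop a square `sz` region from an image array, optionally padded by `pad`."
    def __init__(self, sz,pad=0): self.sz,self.pad = sz,pad
    def __call__(self, img): n = self.sz+self.pad; return img[:n, :n]

def det_flip(p):  return lambda x: x[::-1] if p else x
def det_crop(sz): return lambda x: x[:sz, :sz]

img = np.zeros((64,64))
print(Cropper(32)(img).shape, det_crop(16)(img).shape)  # (32, 32) (16, 16)
```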
fasthtml-hf.txt ADDED
@@ -0,0 +1,29 @@
+ # FastHTML on 🤗 Spaces
+
+ Deploy a FastHTML application to [Hugging Face Spaces](https://huggingface.co/spaces) for free with one command!
+
+ ## Quickstart
+
+ 1. Create a free account on [Hugging Face](https://huggingface.co)
+ 2. Go to your account settings and create an access token with write access. Keep this token safe and don't share it.
+ 3. Set the `HF_TOKEN` environment variable to that token
+ 4. Install fasthtml-hf: `pip install fasthtml-hf`
+ 5. HuggingFace needs `fasthtml-hf` to run your space, so add it to your requirements.txt file.
+ 6. At the top of your `main.py` add `from fasthtml_hf import setup_hf_backup`, and just before you run uvicorn add `setup_hf_backup(app)`. If you run uvicorn in an `if __name__ == "__main__"` block, you'll need to call `setup_hf_backup(app)` *before* the `if` block.
+ 7. Run `fh_hf_deploy <space_name>`, replacing `<space_name>` with the name you want to give your space.
+
+ By default this will upload a public space. You can make it private with `--private true`.
+
+ ## Configuration
+
+ The space will upload a backup of your database to a [Hugging Face Dataset](https://huggingface.co/datasets). By default it will be private and its name will be `<your-huggingface-id>/space-backup`. You can change this behavior in the `config.ini` file. If not provided, a default file will be created with the following contents (note that the `[DEFAULT]` line is required at the top):
+
+ ```
+ [DEFAULT]
+ dataset_id = space-backup
+ db_dir = data
+ private_backup = True
+ interval = 15 # number of minutes between periodic backups
+ ```
+
+ If you so choose, you can disable the automatic backups and use [persistent storage](https://huggingface.co/docs/hub/en/spaces-storage#persistent-storage-specs) instead for $5/month (USD).
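
A minimal sketch of the placement described in step 6 of the quickstart above, assuming a bare-bones FastHTML app run with uvicorn (the route, text, and port are illustrative):

```python
# main.py
from fasthtml.common import fast_app, P
from fasthtml_hf import setup_hf_backup
import uvicorn

app, rt = fast_app()

@rt("/")
def get(): return P("Hello from a backed-up Space")

setup_hf_backup(app)  # call this before the __main__ guard, per step 6

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=7860)
```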
llms-ctx-MonsterUI.txt ADDED
The diff for this file is too large to render. See raw diff
 
llms-ctx-fastHtml.txt ADDED
The diff for this file is too large to render. See raw diff
 
requirements.txt ADDED
@@ -0,0 +1,5 @@
+ fastai
+ python-fasthtml
+ monsterui
+ fasthtml-hf
+ numpy<2.0