nina-m-m committed
Commit bf0f149 • 1 Parent(s): 67ca9ea

Update README.md and refactor code snippets
README.md CHANGED
@@ -8,6 +8,26 @@ tags:
  ---
  
  # ECG2HRV
- Pipeline for the processing of heart rate data from raw ECG signals (.csv file) towards HRV features.
+ Pipeline for the processing of heart rate data from raw ECG signals towards HRV features. For more details see [HUBII](https://hubii.world/)
  
- For more details see [HUBII](https://hubii.world/)
+ ## How to use
+ For importing the model in your project, you can use the following code:
+ ```python
+ # Imports
+ from huggingface_hub import hf_hub_download
+ import joblib
+ 
+ # Define parameters
+ REPO_ID = "HUBII-Platform/ECG2HRV"
+ FILENAME = "feature-extractor.joblib"
+ 
+ # Load the model
+ model = joblib.load(
+     hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
+ )
+ ```
+ Example usage of the model:
+ ```python
+ # TBD: Execute the model with the input data
+ result = model.forward(...)
+ ```
config.json DELETED
File without changes
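
The README's load snippet depends on the model object surviving a serialize/deserialize round-trip. Below is a minimal self-contained sketch of that pattern, using the stdlib `pickle` in place of `joblib` (the two share pickle semantics) and a hypothetical `TinyModel` standing in for the real feature extractor:

```python
import io
import pickle

class TinyModel:
    """Hypothetical stand-in for the serialized feature extractor."""
    def forward(self, rr_ms):
        # Mean RR interval (ms) and the derived mean heart rate (bpm)
        mean_rr = sum(rr_ms) / len(rr_ms)
        return {"mean_rr": mean_rr, "mean_hr": 60000.0 / mean_rr}

# Serialize and restore the model object, mirroring the
# joblib.dump / joblib.load round-trip used by the repository.
buf = io.BytesIO()
pickle.dump(TinyModel(), buf)
buf.seek(0)
model = pickle.load(buf)

print(model.forward([2.0, 3.0, 4.0]))  # {'mean_rr': 3.0, 'mean_hr': 20000.0}
```

Note that pickle (and hence joblib) serializes the instance by reference to its class, so the loading environment must be able to import the model's class definition.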
notebooks/{01_Demo_Pipeline_Usage_In_Code.ipynb → 01_Model_Deployment_Research.ipynb} RENAMED
@@ -3,6 +3,18 @@
  {
  "cell_type": "markdown",
  "source": [
+ "# Introduction\n",
+ "This notebook is used to test different ways to deploy the feature extraction model to the Hugging Face Hub. The goal is to find the best way to deploy the model so that it can be used in the inference API and is easily accessible for users. Ideally, it would also be possible to simply use the huggingface library directly. The following methods will be tested:\n",
+ "\n",
+ "1. [Using ``timm`` to extract features](#Using-timm-to-extract-features) -> ❌\n",
+ "2. [Using ``transformers`` to extract features](#Using-transformers-to-extract-features) -> ❌\n",
+ "    1. [Feature extraction task](#Feature-extraction-task)\n",
+ "    2. [``AutoModel``](#AutoModel)\n",
+ "    3. [Batched feature extraction](#Batched-feature-extraction)\n",
+ "3. [Using simple download](#Using-simple-download) -> ✅\n",
+ "4. [Using custom model](#Using-custom-model) -> 🚧\n",
+ "\n",
+ "**Helpful links and resources**\n",
  "- https://huggingface.co/docs/transformers/custom_models - Alternative creating custom models\n",
  "- https://huggingface.co/templates/feature-extraction - Template for inference API\n",
  "- https://huggingface-widgets.netlify.app/ - Widgets for visualizing models in inference API\n",
@@ -12,6 +24,15 @@
  "collapsed": false
  }
  },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "# Imports"
+ ],
+ "metadata": {
+ "collapsed": false
+ }
+ },
  {
  "cell_type": "code",
  "execution_count": 12,
@@ -32,7 +53,7 @@
  {
  "cell_type": "markdown",
  "source": [
- "# Using timm to extract features"
+ "# 1. Using ``timm`` to extract features"
  ],
  "metadata": {
  "collapsed": false
@@ -136,7 +157,7 @@
  {
  "cell_type": "markdown",
  "source": [
- "# Using transformers to extract features"
+ "# 2. Using ``transformers`` to extract features"
  ],
  "metadata": {
  "collapsed": false
@@ -309,7 +330,7 @@
  {
  "cell_type": "markdown",
  "source": [
- "2. AutoModel"
+ "2. ``AutoModel``"
  ],
  "metadata": {
  "collapsed": false
@@ -369,7 +390,7 @@
  {
  "cell_type": "markdown",
  "source": [
- "# Using simple download\n",
+ "# 3. Using simple download\n",
  "(See https://huggingface.co/julien-c/wine-quality?structured_data=%7B%7D)"
  ],
  "metadata": {
@@ -378,11 +399,27 @@
  },
  {
  "cell_type": "code",
- "execution_count": null,
+ "execution_count": 25,
  "outputs": [],
  "source": [
- "from huggingface_hub import hf_hub_url, cached_download\n",
- "import joblib"
+ "from huggingface_hub import hf_hub_download\n",
+ "import joblib\n",
+ "import torch\n",
+ "\n",
+ "from src.model import HR2HRV"
+ ],
+ "metadata": {
+ "collapsed": false,
+ "ExecuteTime": {
+ "end_time": "2024-02-21T11:39:25.775871100Z",
+ "start_time": "2024-02-21T11:39:25.755838Z"
+ }
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**Instantiate model and save the model as a joblib file in the huggingface repository**"
  ],
  "metadata": {
  "collapsed": false
@@ -390,15 +427,99 @@
  },
  {
  "cell_type": "code",
- "execution_count": null,
+ "execution_count": 27,
  "outputs": [],
  "source": [
+ "# Instantiate model\n",
+ "model = HR2HRV()\n",
+ "# Save\n",
+ "joblib.dump(model, \"..\\ECG2HRV.joblib\")\n",
+ "# Load in notebook\n",
+ "model = joblib.load(\"..\\ECG2HRV.joblib\")"
+ ],
+ "metadata": {
+ "collapsed": false,
+ "ExecuteTime": {
+ "end_time": "2024-02-21T12:01:28.600527100Z",
+ "start_time": "2024-02-21T12:01:28.580278200Z"
+ }
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "**Test if the model can be loaded from the hub and used**"
+ ],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 19,
+ "outputs": [],
+ "source": [
+ "# Load from hub\n",
  "REPO_ID = \"HUBII-Platform/ECG2HRV\"\n",
  "FILENAME = \"feature-extractor.joblib\"\n",
  "\n",
- "model = joblib.load(cached_download(\n",
- " hf_hub_url(REPO_ID, FILENAME)\n",
- "))\n"
+ "model = joblib.load(\n",
+ " hf_hub_download(repo_id=REPO_ID, filename=FILENAME)\n",
+ ")"
+ ],
+ "metadata": {
+ "collapsed": false,
+ "ExecuteTime": {
+ "end_time": "2024-02-21T11:36:52.302912800Z",
+ "start_time": "2024-02-21T11:36:52.145834500Z"
+ }
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "outputs": [],
+ "source": [
+ "# Create an example tensor input\n",
+ "tensor = torch.tensor([2.0, 3.0, 4.0])"
+ ],
+ "metadata": {
+ "collapsed": false,
+ "ExecuteTime": {
+ "end_time": "2024-02-21T11:36:55.990475100Z",
+ "start_time": "2024-02-21T11:36:55.989181100Z"
+ }
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 24,
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "{'rmssd': tensor(1.), 'mean_rr': tensor(3.), 'sdnn': tensor(1.), 'mean_hr': tensor(20000.)}\n"
+ ]
+ }
+ ],
+ "source": [
+ "# Run forward pass\n",
+ "output = model.forward(tensor)\n",
+ "print(output)\n"
+ ],
+ "metadata": {
+ "collapsed": false,
+ "ExecuteTime": {
+ "end_time": "2024-02-21T11:37:12.239246200Z",
+ "start_time": "2024-02-21T11:37:12.219556200Z"
+ }
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "# 4. Using custom model (not tested yet)\n"
+ ],
  "metadata": {
  "collapsed": false
pipeline.py → src/pipeline.py RENAMED
File without changes
pipeline_wrapper.py → src/pipeline_wrapper.py RENAMED
File without changes
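
The notebook's forward pass prints a dict with the keys `rmssd`, `mean_rr`, `sdnn`, and `mean_hr`, and for the input `[2.0, 3.0, 4.0]` yields `{'rmssd': 1, 'mean_rr': 3, 'sdnn': 1, 'mean_hr': 20000}`. A rough pure-Python sketch of time-domain HRV features consistent with that printed output (this is inferred from the values, not the actual HR2HRV implementation; RR intervals are assumed to be in milliseconds, and SDNN is assumed to use the sample standard deviation):

```python
import math

def hrv_features(rr_ms):
    """Sketch of time-domain HRV features consistent with the notebook output."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # RMSSD: root mean square of successive RR-interval differences
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    # SDNN: sample standard deviation (ddof=1) of the RR intervals
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / (n - 1))
    # Mean heart rate in bpm, assuming RR intervals in milliseconds
    mean_hr = 60000.0 / mean_rr
    return {"rmssd": rmssd, "mean_rr": mean_rr, "sdnn": sdnn, "mean_hr": mean_hr}

print(hrv_features([2.0, 3.0, 4.0]))
# {'rmssd': 1.0, 'mean_rr': 3.0, 'sdnn': 1.0, 'mean_hr': 20000.0}
```

The `mean_hr` of 20000 bpm in the notebook simply reflects the toy input `[2.0, 3.0, 4.0]`; realistic RR intervals are on the order of 600-1200 ms.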