BAAI

3v324v23 committed
Commit
9018197
·
1 Parent(s): 8268fab

update reamde

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -119,7 +119,7 @@ with torch.inference_mode():
 
 
 peak_memory_allocated = torch.cuda.max_memory_allocated()
-print(f"Memory Peak: {peak_memory_allocated / (1024**3):.2f} GB") # Convert to GB
+print(f"Memory Peak: {peak_memory_allocated / (1024**3):.2f} GB")
 print(response)
 ```
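The changed line only drops a trailing comment; the conversion itself stays the same: `torch.cuda.max_memory_allocated()` returns peak GPU memory in bytes, and dividing by `1024**3` converts bytes to GiB for display. A minimal sketch of that formatting, using a hypothetical byte count in place of a real CUDA measurement:

```python
# Hypothetical peak allocation in bytes (stands in for
# torch.cuda.max_memory_allocated(), which needs a GPU).
peak_memory_allocated = 6_442_450_944  # exactly 6 GiB

# Same formatting as the README line: bytes -> GiB via 1024**3.
print(f"Memory Peak: {peak_memory_allocated / (1024**3):.2f} GB")  # Memory Peak: 6.00 GB
```

Note the label says "GB" while the divisor `1024**3` is the binary unit (GiB); the snippet follows the README's convention.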