<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>hydaiの空想世界</title>
<link href="https://hyd.ai/atom.xml" rel="self"/>
<link href="https://hyd.ai/"/>
<updated>2025-08-08T13:56:37.275Z</updated>
<id>https://hyd.ai/</id>
<author>
<name>hydai</name>
</author>
<generator uri="https://hexo.io/">Hexo</generator>
<entry>
<title>how I use AI in daily life 2025/08</title>
<link href="https://hyd.ai/2025/08/08/how-I-use-AI-in-daily-life/"/>
<id>https://hyd.ai/2025/08/08/how-I-use-AI-in-daily-life/</id>
<published>2025-08-08T13:52:34.000Z</published>
<updated>2025-08-08T13:56:37.275Z</updated>
<content type="html"><![CDATA[<img src="/2025/08/08/how-I-use-AI-in-daily-life/header.png" class=""><h2 id="我如何在生活中使用-AI"><a href="#我如何在生活中使用-AI" class="headerlink" title="我如何在生活中使用 AI"></a>How I Use AI in Daily Life</h2><p>First, the AI services I currently subscribe to and use:</p><ol><li>ChatGPT Plus (20 USD/month)</li><li>Claude 20x Max (200 USD/month), planning to drop back to 5x Max (100 USD/month)</li><li>Gemini AI Pro (690 NTD/month), will not renew once the trial ends</li><li>OpenAI API Key (100 USD/month)</li><li>Grok (the quota included with an X subscription)</li><li>Local LLM (electricity bill/month)</li></ol><span id="more"></span><p>When Claude Code first launched, you had to wire it up to an API key, and with token spend potentially a bottomless pit, the pressure of closing my eyes and throwing a hundred US dollars down the drain was simply too much. Back then I had not yet realized that AI could make a particularly noticeable difference in my life.</p><p>So what do all those expenses above actually buy me?</p><p>Let's start with ChatGPT. I have subscribed continuously since the day purchases opened. Its main value is a high level of emotional support: I even built a dedicated praise-me GPT. Yes, I am a grown adult, but sometimes I still want to be properly praised (there there, what a good kid). In recent updates its image generation has also performed well.</p><p>Next, Claude. What I value is its coding ability: I can hand it a few ideas and quickly grow an artifact page that I can take to other people for discussion. Beyond that, whenever I cannot be bothered to read a man page, I just paste the problem in and ask directly which command achieves my goal.</p><p>Gemini AI Pro came with a four-month free trial. Its video generation turned out to be quite good, but that was about it; Gemini has not impressed me enough otherwise. Because Gemini's context window is especially large, my main use now is having gemini-cli read documents and summarize them for me. The free tier of gemini-cli covers that usage, so I will unsubscribe afterwards.</p><p>The OpenAI API key is mainly wired into n8n to automate everyday chores, for example:</p><ol><li>Send voice notes, images, receipts, and PDFs to Telegram, forward them to n8n where an AI converts them into beancount format for storage, and finally mail me daily, weekly, and monthly reports.</li><li>Automatically fetch specific RSS feeds every day and generate a newsletter telling me which articles are particularly interesting.</li><li>For podcasts I follow, automatically download the latest episode, transcribe it with Whisper, and send me a bilingual version with a summary.</li><li>When my oshi finishes a live stream, automatically download the VOD, have Whisper produce a transcript, use an LLM to pick out the likely highlight segments, then hand off to ffmpeg to cut the clips quickly. No matter how busy I am, I cannot miss my oshi's best moments (=゚ω゚)ノ</li></ol><p>Grok: those who know, know. It is less restricted, so you can discuss relatively politically incorrect topics.</p><p>Against this backdrop, I keep thinking about and refining how to use AI to speed myself up. After all, too many people are in the business of manufacturing anxiety; everyone wants to sell you a course and harvest you like a leek. I am tired, worn down body and soul by all that disgusting anxiety-mongering.</p><p>Fortunately time moves fast. When Claude Code announced rate-limited, all-you-can-eat usage on the same subscription, one try made me a regular. Imagine your life consumed by work, and when inspiration strikes you have no energy left to act on it. How frustrating is that? Claude Code is now an essential part of my life: whenever an idea pops up in a spare moment, or I need a short-lived dirty hack to solve something, instead of writing it myself as in the past, it is now "I choose you. Go, Claude!" Tokens burn, code grows, <del>and Kelp strikes it rich (not really)</del></p><p>Thanks to Claude Code, while gaming, binge-watching, or relaxing, I can pick up my phone, send commands remotely, and have it draft a few MVPs first; by the time I am back at my desk, I quickly review the results and decide the next step. It feels a bit like a global factory running 24/7. As for why I want to drop back to 5x Max: the usage quota has been cut back severely, so my current arrangement is to plan with Opus, then switch to Sonnet on other platforms to finish the remaining work.</p><p>If you have not yet tried having AI do things for you, then for coding I personally recommend gemini-cli (free to try; when the Pro quota runs out it automatically falls back to Flash, all at no cost). Amazon Q Developer is also usable and comes with a decent free allowance. Just move your lips and let the AI run its legs off for you ψ(`∇´)ψ</p>]]></content>
<summary type="html"><img src="/2025/08/08/how-I-use-AI-in-daily-life/header.png" class="">
<h2 id="我如何在生活中使用-AI"><a href="#我如何在生活中使用-AI" class="headerlink" title="我如何在生活中使用 AI"></a>How I Use AI in Daily Life</h2><p>First, the AI services I currently subscribe to and use:</p>
<ol>
<li>ChatGPT Plus (20 USD&#x2F;month)</li>
<li>Claude 20x Max (200 USD&#x2F;month), planning to drop back to 5x Max (100 USD&#x2F;month)</li>
<li>Gemini AI Pro (690 NTD&#x2F;month), will not renew once the trial ends</li>
<li>OpenAI API Key (100 USD&#x2F;month)</li>
<li>Grok (the quota included with an X subscription)</li>
<li>Local LLM (electricity bill&#x2F;month)</li>
</ol>
</ol></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="LLM" scheme="https://hyd.ai/tags/LLM/"/>
<category term="AI" scheme="https://hyd.ai/tags/AI/"/>
<category term="ChatGPT" scheme="https://hyd.ai/tags/ChatGPT/"/>
<category term="Grok" scheme="https://hyd.ai/tags/Grok/"/>
<category term="Claude" scheme="https://hyd.ai/tags/Claude/"/>
</entry>
<entry>
<title>My Dream PC is finally here</title>
<link href="https://hyd.ai/2025/07/04/my-dream-pc-is-finally-here-2025/"/>
<id>https://hyd.ai/2025/07/04/my-dream-pc-is-finally-here-2025/</id>
<published>2025-07-04T15:12:55.000Z</published>
<updated>2025-07-04T17:31:07.642Z</updated>
<content type="html"><![CDATA[<img src="/2025/07/04/my-dream-pc-is-finally-here-2025/01-dream-pc.png" class=""><h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Foreword</h2><p>I have a dream: to own a computer that can compile large projects at high speed, run large language models (LLMs), and play games at 4K 60 FPS with ray tracing fully cranked up.</p><p>This year that dream finally came true! I built the dream machine with the specs I had always longed for. It is a somewhat long story, so allow me to tell it from the beginning.</p><p>About the cover image: FF14, a game I love, introduced an arena in its 7.x patches that everyone jokingly calls "a standing GPU" (pictured below), so I originally asked ChatGPT to design my new PC around that concept. I was delighted that the builder's final design came very close to it.</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/02-ff14-gpu.jpg" class=""><span id="more"></span><h2 id="目前的主力機"><a href="#目前的主力機" class="headerlink" title="目前的主力機"></a>My Current Main Machine</h2><p>Before the new PC arrived, my main machine was a Ship of Theseus: first assembled in 2017, with some parts upgraded every so often.</p><p>The original build:</p><ul><li><strong>CPU</strong>: Intel Core i5; young and naive, I figured a gaming rig only needed the GTX 1080 Ti QQ</li><li><strong>GPU</strong>: NVIDIA GeForce GTX 1080 Ti</li><li><strong>RAM</strong>: Kingston 32GB DDR4</li><li>I no longer remember the rest of the spec</li></ul><p>In the "AMD Yes" era, friends who had moved to the AMD 3900X all said it was excellent and that only a fool would buy Intel, which left i5-owning me itching badly.</p><p>So when the AMD Ryzen 5000 series launched in 2020, I decided to upgrade, and eagerly queued for launch day at 原價屋 (back before its price-gouging days) with a friend. Unfortunately, after the strong showing of the Ryzen 3000 series, everyone wanted a 5900X, and by the time my turn came they were out of stock: I could pre-order and wait, or grit my teeth and go straight to the 5950X. But back then my PC was still a gaming machine, and the 5950X was extravagance I would never use, so I dutifully queued for the 5900X. Luckily stock arrived quickly; I remember the shop calling just two weeks later to say I could pick it up.</p><p>The spec after the first revision:</p><ul><li><strong>CPU</strong>: AMD Ryzen 9 5900X</li><li><strong>Motherboard</strong>: ASUS ROG Crosshair VIII Hero (WI-FI)</li><li><strong>RAM</strong>: G.Skill 32GB x 2 DDR4</li><li><strong>GPU</strong>: NVIDIA GeForce GTX 1080 Ti</li><li><strong>SSD</strong>: Samsung 970 Pro 1TB NVMe SSD</li><li><strong>PSU</strong>: I forget which, but I remember thinking a 5900X plus a 1080 Ti would not draw much power, so I bought a relatively small unit. A later tragedy would make me regret that bitterly.</li></ul><p>Some time later, in 2021, I found the 1080 Ti could no longer run mainstream AAA games at 4K, and the urge to upgrade began to stir again. Unfortunately the mining boom hit, and the card (and price) I wanted never came, until I could no longer stand the terrible gaming experience and paid a slightly rip-off price for an ASUS ROG-STRIX-RTX3080TI-O12G-GAMING at NT$43,999. I remember prices kept falling not long after the mining crash; bad timing, and I had not expected GPU prices to drop that fast once the crash came. At the same time I added a Samsung 980 Pro 1TB NVMe SSD as an extra drive.</p><p>The spec after the second revision:</p><ul><li><strong>CPU</strong>: AMD Ryzen 9 5900X</li><li><strong>CPU cooler</strong>: Noctua NH-D15</li><li><strong>Motherboard</strong>: ASUS ROG Crosshair VIII Hero (WI-FI)</li><li><strong>RAM</strong>: G.Skill 32GB x 2 DDR4</li><li><strong>GPU</strong>: NVIDIA GeForce RTX 3080 Ti</li><li><strong>SSD</strong>: Samsung 970 Pro 1TB NVMe SSD</li><li><strong>SSD</strong>: Samsung 980 Pro 1TB NVMe SSD</li><li><strong>PSU</strong>: I forget which, but this is the upgrade where the tragedy struck.</li></ul><p>Things were fine at first; under everyday workloads the GPU did not draw much power. Then I discovered that FF14 has a mandatory four-player cavern dungeon early in the main scenario. Yes, I remember it that vividly because I re-ran it no fewer than 60 times: every time, the screen would suddenly go black with no response, and the PC would reboot itself. I did not realize the PSU was failing; I thought it was a driver problem, or that air cooling could not keep the CPU temperature down. At that point a good friend said he had an unused water cooler lying idle and offered it to me to try.</p><p>The spec after the third revision:</p><ul><li><strong>CPU</strong>: AMD Ryzen 9 5900X</li><li><strong>CPU cooler</strong>: NZXT Kraken Z53 RGB 240mm AIO with LCD</li><li><strong>Motherboard</strong>: ASUS ROG Crosshair VIII Hero (WI-FI)</li><li><strong>RAM</strong>: G.Skill 32GB x 2 DDR4</li><li><strong>GPU</strong>: NVIDIA GeForce RTX 3080 Ti</li><li><strong>SSD</strong>: Samsung 970 Pro 1TB NVMe SSD</li><li><strong>SSD</strong>: Samsung 980 Pro 1TB NVMe SSD</li><li><strong>PSU</strong>: I forget which, but it was the culprit of the tragedy itself.</li></ul><p>The water cooler did lower the temperatures, but the display signal still kept dropping out in FF14. After eliminating the impossible, dear reader, consider: the PSU was the only part of the whole machine I had not yet replaced. So I swapped in the ASUS ROG Strix 750W that came bundled with the 3080 Ti, and justice and goodness returned to the world.</p><p>The spec after the fourth revision:</p><ul><li><strong>CPU</strong>: AMD Ryzen 9 5900X</li><li><strong>CPU cooler</strong>: NZXT Kraken Z53 RGB 240mm AIO with LCD</li><li><strong>Motherboard</strong>: ASUS ROG Crosshair VIII Hero (WI-FI)</li><li><strong>RAM</strong>: G.Skill 32GB x 2 DDR4</li><li><strong>GPU</strong>: NVIDIA GeForce RTX 3080 Ti</li><li><strong>SSD</strong>: Samsung 990 Pro 1TB NVMe SSD # quietly upgraded the 970 to a 990; I recall it was a big Amazon US sale</li><li><strong>SSD</strong>: Samsung 980 Pro 1TB NVMe SSD</li><li><strong>PSU</strong>: ASUS ROG Strix 750W</li></ul><p>This is why I call her a Ship of Theseus: from the 2017 build onward, generations of CPU changes left only that 1080 Ti behind, and eventually even the 1080 Ti was replaced. Today's main machine is no longer the computer I originally assembled at all.</p><h2 id="為什麼要組裝新電腦?"><a href="#為什麼要組裝新電腦?" class="headerlink" title="為什麼要組裝新電腦?"></a>Why Build a New PC?</h2><h3 id="她依舊是那良人,是我變了"><a href="#她依舊是那良人,是我變了" class="headerlink" title="她依舊是那良人,是我變了"></a>She Is Still the One; It Is I Who Changed</h3><p>To this day there is nothing about this machine I am unhappy with, except that her performance no longer keeps up with my needs.</p><p>Come 2024, I had long wanted to upgrade the GPU, mainly because so many new games demand FPS < 60 as the price of 4K plus ray tracing, and it pained me that certain games I love could not be played at those settings. I originally wanted a 4090, but while I waited for its price to come down, the 50 series arrived.</p><p>Measured against this machine's original purpose, the requirements have shifted from a dedicated gaming rig to a productivity tool. For gaming she is still the one: turn a few settings down and she delivers 60 FPS at 4K. But I am no longer the person I was. The world of LLMs has grown my appetite; my fault, the outside world is just too fascinating. Beyond better gaming performance, I also want to run large language models (LLMs) locally, so I can run all kinds of AI experiments on my own machine without depending on cloud services.</p><h3 id="時不我與的悲哀"><a href="#時不我與的悲哀" class="headerlink" title="時不我與的悲哀"></a>The Sorrow of Bad Timing</h3><p>With the rise of AI, however, GPU prices have climbed to unacceptable levels. Worse, GPUs now trade like futures: besides stomaching the steep price, you wait a very long time for delivery. That made upgrading feel like gambling: I did not know when the card would arrive, whether the price would climb even higher, or whether something better would already be out by the time it came.</p><p>Consumption is often an impulse. On a sleepless night I decided to place an order with 原價屋, ready to join the ranks of suckers, and at that very moment the 加價屋 price-gouging scandal broke. Suddenly I did not know which shop to trust, and since my spec was fairly high-end, avoiding the gougers made it hard to source everything. One shop I did consider was the custom water-cooling store in Guanghua Market that Linus Tech Tips visited on their 2024 trip to Taiwan; after seeing Linus's video I figured their quality should be good. But by then the scandal had doused my enthusiasm, and I never asked.</p><h3 id="過於夢幻的規格"><a href="#過於夢幻的規格" class="headerlink" title="過於夢幻的規格"></a>An All-Too-Dreamy Spec</h3><p>Throughout this period I kept thinking about my needs: a computer that compiles large projects at high speed, runs large language models (LLMs), and plays games at 4K 60 FPS with ray tracing maxed out. Those needs pushed the spec higher:</p><ul><li>AMD 9950X3D</li><li>DDR5 Memory 192GB (48GB*4) (all slots populated; I know there is a cost, but there seem to be possible workarounds)</li><li>NVIDIA 5090 32GB</li><li>GEN5 SSD</li><li>Water cooling for both CPU and GPU</li></ul><h3 id="突如其來的緣分"><a href="#突如其來的緣分" class="headerlink" title="突如其來的緣分"></a>A Sudden Stroke of Fate</h3><p>I happened to discuss my needs with a friend (yes, friends everywhere; I never expected my PC journey to involve this much help from friends), and he knew someone doing custom open-loop water cooling. Having only ever used AIOs, my enthusiasm flared back to life, and on another dark and windy sleepless night I impulsively decided to kick off this purchase.</p><p>I trust my friends, and I trust the brands I have trusted before, as my buying habits show:</p><ol><li>Plenty of people have horror stories about Samsung SSDs, but mine have never failed me since I started using them, so I keep buying them.</li><li>Nearly a full set of ROG parts, because the friend who ran the 3900X said that board was good, so I would only consider their graphics cards too.</li><li>My previous AIO was likewise one a friend bought, liked, and passed on to me after switching solutions for other reasons; had I not gone with open-loop water cooling this time, I would probably have considered an NZXT AIO.</li></ol><h3 id="一段漫長的旅程"><a href="#一段漫長的旅程" class="headerlink" title="一段漫長的旅程"></a>A Long Journey</h3><p>What follows is a running log of the purchase, from February 2025 until I took delivery on July 2, 2025. Please note that I have no intention of blaming the shop; the long wait was worth it, and because I trusted my friend's recommendation, I knew the shop would come through.</p><p>February 26, 2025: I described my requirements to the shop. When I asked for 192GB of memory, they were upfront that fully populating DDR5 slots might not give the best performance, and asked whether 96GB (48GB x 2, dual channel) would really not be enough. But I feared the what-if. I still remember how skimping on a PSU once left me lost in a four-player dungeon sixty-odd times; I am scared, genuinely scared of that. What if an even better AI model comes along that needs more memory? Or some other application wants more? Better to grit my teeth and fill every slot for peace of mind.<br>March 2, 2025: the shop said they would send a quote as soon as possible, and then, quietly, it was April. Only after returning to Taiwan from Tokyo did it hit me: "my birthday has come and gone, and I still have not seen a quote for the computer?"<br>April 2, 2025: the very day that thought occurred, while I was preparing for tomb-sweeping, the quote arrived. After I confirmed the spec, the shop explained that the RTX 5090 water block was special-order hardware and would take about 30 days after the deposit. Since the wait would be long anyway, I paid the deposit immediately: once you have decided to buy, do not wait, or the wait-and-see club will keep you waiting until next year.<br>April 18, 2025: with a business trip to Paris coming up and no return until mid-May, and on the principle of not carrying debts into the new year, I contacted the shop: since the water block would arrive at the end of the month and the build could finish in early May, I might as well pay the balance up front. Little did I know the surprise waiting there XD<br>May 12, 2025: time flew. While visiting the Sacré-Cœur, happily imagining a new PC to play with the moment I got home, I asked for a progress update and was told "the GPU water block and the memory have not arrived yet." More waiting, then. I suspect this surprise had something to do with Trump's tariff war, since the delay happened to line up with that timeline. Of course, that is purely my own speculation, with no evidence whatsoever.<br>June 5, 2025: time rushed on. That day I saw on YouTube that Linus had built a 5090 monster rig at Computex, and I bolted upright from my sickbed: "where is MY computer OAO?!" I asked the shop again and was told, "the water block has arrived, but the memory has not; the manufacturer only shipped it from the factory at the end of May and it still has not reached Taiwan." No matter. I am a very patient Kelp; having waited this long, surely I can wait a little longer o(`ω´ )o<br>June 17, 2025: sleepless again, I watched friends who had gone through Amazon Germany and Amazon France receive their 50-series cards from abroad and happily game, and got asked, "Kelp, how does your new rig perform? Let's compare notes." All I could do was lament that the supplier's slow deliveries were keeping the shop from assembling and shipping. Building a PC really takes forever now: from April 2, 2025 to June 17, 2025 is already two and a half months, with queues for many parts. How I miss the good old days when nothing was out of stock and you could unbox and build the day you bought QQ<br>June 17, 2025: yes, the very same day. Right after I wrote down the log above, the shop got in touch to say the memory had finally arrived; they would assemble the machine as soon as possible and notify me when it was done. As for that 192GB (48GB x 4) memory kit, the shop had only 15 sets in all of Taiwan, and mine had arrived just the day before they contacted me. What luck.<br>June 27, 2025: the shop told me assembly was complete and burn-in testing had begun.<br>July 2, 2025: when the shop delivered the machine, I had already been sitting in the ground-floor lobby for thirty minutes, unable to wait to see my new computer. The day had finally come.</p><h2 id="組裝規格"><a href="#組裝規格" class="headerlink" title="組裝規格"></a>Build Spec</h2><p>The final spec:</p><ul><li><strong>CPU</strong>: AMD Ryzen 9 9950X3D</li><li><strong>Motherboard</strong>: MSI MEG X870E GODLIKE</li><li><strong>RAM</strong>: Biwin DW100 RGB D5-6000 192GB (48GB x 4)</li><li><strong>GPU</strong>: NVIDIA RTX 5090 Vanguard</li><li><strong>Case</strong>: LIAN LI O11D EVO XL</li><li><strong>PSU</strong>: FSP HYDRO PTM PRO 1350W, Platinum-rated, fully modular</li><li><strong>CPU water block</strong>: Alphacool APEX 1 AM5</li><li><strong>GPU water block</strong>: Bykski N-MS5090DTRIO-X</li><li><strong>SSD</strong>: Samsung 9100 Pro 2TB NVMe SSD</li><li><strong>SSD</strong>: Samsung 9100 Pro 4TB NVMe SSD</li></ul><h2 id="收到貨當下"><a href="#收到貨當下" class="headerlink" title="收到貨當下"></a>On Delivery Day</h2><p>Here she is at the door</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/03-my-pc-in-box.jpg" class=""><p>Apologies, I cannot share the actual unboxing yet, because I am reorganizing my home and it is too messy to photograph. But you can see photos of my build on the shop's Facebook page:</p><ul><li><a href="https://www.facebook.com/share/p/1FcL4c223T/">https://www.facebook.com/share/p/1FcL4c223T/</a></li><li><a href="https://www.facebook.com/share/p/1DziEP8QJP/">https://www.facebook.com/share/p/1DziEP8QJP/</a></li></ul><h2 id="實際表現"><a href="#實際表現" class="headerlink" title="實際表現"></a>Real-World Performance</h2><h3 id="噪音"><a href="#噪音" class="headerlink" title="噪音"></a>Noise</h3><p>Incredibly quiet. Even at full load the GPU sits at about 50°C, and a thermometer placed on top of the case reads roughly 30°C. The thermals are excellent, and above all it is supremely quiet.</p><p>Compared to what? My air-cooled 3080 Ti at full load hit 80°C with something like 65 dB of noise, the kind of fan roar that forces you to wear headphones to get anything done.</p><h3 id="效能"><a href="#效能" class="headerlink" title="效能"></a>Performance</h3><h4 id="編譯速度"><a href="#編譯速度" class="headerlink" title="編譯速度"></a>Compile Speed</h4><p>The projects I compile regularly now finish within a minute. Pure bliss. I have not yet tested giants like LLVM, but it is certainly a good deal faster than the 5900X.</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/04-compilation-test.jpg" class=""><h4 id="LLM-推論"><a href="#LLM-推論" class="headerlink" title="LLM 推論"></a>LLM Inference</h4><p>Gemma 3 27b qat performs as follows:</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/05-gemma3-27b-it-qat.jpg" class=""><p>Gemma 3 27b q4-k-m performs as follows:</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/06-gemma3-27b-it-q4-k-m.jpg" class=""><p>Qwen 3 30b q4-k-m performs as follows:</p><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/07-qwen3-30b-q4-k-m.jpg" class=""><p>I am very satisfied. At this speed I can run many AI-agent experiments locally and reduce my reliance on cloud services.</p><h2 id="結語"><a href="#結語" class="headerlink" title="結語"></a>Closing Thoughts</h2><p>The wait was worth it. I am glad the shop was recommended by my friend <a href="https://aotoki.me/">蒼時弦也</a>. Although it was a long wait, buffeted along the way by supplier delays, the final result left me thoroughly satisfied.</p><h2 id="附錄"><a href="#附錄" class="headerlink" title="附錄"></a>Appendix</h2><p>If you would like a custom build like this, then based on my experience so far with its quality, I recommend this shop: <a href="https://www.facebook.com/ZATPCtw">捷特水冷電腦</a></p><p>May everyone building a new PC have luck on their side, avoid long supply-driven waits, and assemble the dream machine of their heart.</p>]]></content>
<summary type="html"><img src="/2025/07/04/my-dream-pc-is-finally-here-2025/01-dream-pc.png" class="">
<h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Foreword</h2><p>I have a dream: to own a computer that can compile large projects at high speed, run large language models (LLMs), and play games at 4K 60 FPS with ray tracing fully cranked up.</p>
<p>This year that dream finally came true! I built the dream machine with the specs I had always longed for. It is a somewhat long story, so allow me to tell it from the beginning.</p>
<p>About the cover image: FF14, a game I love, introduced an arena in its 7.x patches that everyone jokingly calls "a standing GPU" (pictured below), so I originally asked ChatGPT to design my new PC around that concept. I was delighted that the builder's final design came very close to it.</p>
<img src="/2025/07/04/my-dream-pc-is-finally-here-2025/02-ff14-gpu.jpg" class=""></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="LLM" scheme="https://hyd.ai/tags/LLM/"/>
<category term="NVIDIA" scheme="https://hyd.ai/tags/NVIDIA/"/>
<category term="PC Build" scheme="https://hyd.ai/tags/PC-Build/"/>
<category term="AMD" scheme="https://hyd.ai/tags/AMD/"/>
<category term="Gaming" scheme="https://hyd.ai/tags/Gaming/"/>
</entry>
<entry>
<title>Compiling llama.cpp and Deploying AI Applications on the NVIDIA Jetson Orin AGX</title>
<link href="https://hyd.ai/2025/03/07/llamacpp-on-jetson-orin-agx/"/>
<id>https://hyd.ai/2025/03/07/llamacpp-on-jetson-orin-agx/</id>
<published>2025-03-07T05:28:00.000Z</published>
<updated>2025-03-07T12:13:13.900Z</updated>
<content type="html"><![CDATA[<img src="/2025/03/07/llamacpp-on-jetson-orin-agx/img.jpg" class=""><h2 id="簡介"><a href="#簡介" class="headerlink" title="簡介"></a>Introduction</h2><p>For work, we use llama.cpp as the backend for WASI-NN, giving WebAssembly the ability to use AI models, which means we need to compile llama.cpp as a dependency library on a variety of platforms.<br>The 64GB NVIDIA Jetson Orin AGX, as a CUDA-capable platform with relatively generous VRAM, is naturally a target we put a lot of effort into testing and optimizing on.<br>This article records in detail the complete workflow of compiling llama.cpp on the NVIDIA Jetson Orin AGX (JetPack 6.2), converting a large language model to GGUF format, quantizing the model, and finally deploying an AI application.</p><span id="more"></span><h2 id="升級至-JetPack-6-2"><a href="#升級至-JetPack-6-2" class="headerlink" title="升級至 JetPack 6.2"></a>Upgrading to JetPack 6.2</h2><p>Whether you are on an NVIDIA Jetson Orin AGX or a Jetson Orin Nano, I strongly recommend upgrading to JetPack 6.2 or later so that you get the latest CUDA and the unlocked performance; this release also provides an Ubuntu 22.04 and CUDA 12.6 environment with better overall support.<br>Installation steps vary by model, so please <a href="https://docs.nvidia.com/jetson/jetpack/install-setup/index.html">refer to the official NVIDIA documentation</a> for the latest instructions; I will not repeat them here.</p><h2 id="編譯-llama-cpp"><a href="#編譯-llama-cpp" class="headerlink" title="編譯 llama.cpp"></a>Compiling llama.cpp</h2><h3 id="安裝相依性套件"><a href="#安裝相依性套件" class="headerlink" title="安裝相依性套件"></a>Installing Dependencies</h3><p>With JetPack 6.2 or later, the CUDA 12.6 toolchain comes pre-installed, but a few dependencies are still needed to make sure the llama.cpp build runs into no problems.</p><ul><li><code>build-essential</code>: the compiler toolchain, e.g. GCC/G++</li><li><code>cmake</code>: the CMake build tool; llama.cpp has switched to CMake for its builds</li><li><code>git</code>: the Git version control tool, used to download the llama.cpp source</li></ul><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">sudo</span> apt update <span class="comment"># refresh the package lists</span></span><br><span class="line"><span class="built_in">sudo</span> apt upgrade -y <span class="comment"># upgrade all installed packages</span></span><br><span class="line"><span class="built_in">sudo</span> apt install -y build-essential cmake git <span class="comment"># install the compiler toolchain, CMake, and Git</span></span><br></pre></td></tr></table></figure><h3 id="下載-llama-cpp"><a href="#下載-llama-cpp" class="headerlink" title="下載 llama.cpp"></a>Downloading llama.cpp</h3><p>If you have cloned the llama.cpp repo before, note that it has moved from <code>https://github.com/ggerganov/llama.cpp.git</code> to <code>https://github.com/ggml-org/llama.cpp.git</code>, so remember to update your remote.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">git <span class="built_in">clone</span> https://github.com/ggml-org/llama.cpp.git</span><br><span class="line"><span class="built_in">cd</span> llama.cpp</span><br></pre></td></tr></table></figure><h3 id="編譯-llama-cpp-1"><a href="#編譯-llama-cpp-1" class="headerlink" title="編譯 llama.cpp"></a>Building llama.cpp</h3><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">cmake -B build -DGGML_CUDA=ON <span class="comment"># generate the build files in the build directory</span></span><br><span class="line">cmake --build build --parallel <span class="comment"># build in parallel, squeezing every CPU core the machine has</span></span><br></pre></td></tr></table></figure><p>On my Jetson Orin AGX, compiling llama.cpp takes roughly 30 minutes, depending on machine performance.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">cmake --build build --parallel 9873.06s user 246.00s system 645% cpu 26:07.55 total</span><br></pre></td></tr></table></figure><p>After the build completes, all of llama.cpp's executables can be found in <code>build/bin</code>.</p><h4 id="若發生-CUDA-相關的編譯錯誤"><a href="#若發生-CUDA-相關的編譯錯誤" class="headerlink" title="若發生 CUDA 相關的編譯錯誤"></a>If a CUDA-Related Build Error Occurs</h4><p>Like this:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br></pre></td><td class="code"><pre><span class="line">-- Found CUDAToolkit: /usr/local/cuda/include (found version <span class="string">"12.6.68"</span>)</span><br><span class="line">-- CUDA Toolkit found</span><br><span class="line">-- Using CUDA architectures: 50;61;70;75;80</span><br><span class="line">CMake Error at /usr/share/cmake-3.22/Modules/CMakeDetermineCompilerId.cmake:726 (message):</span><br><span class="line"> Compiling the CUDA compiler identification <span class="built_in">source</span> file</span><br><span class="line"> <span class="string">"CMakeCUDACompilerId.cu"</span> failed.</span><br><span class="line"></span><br><span class="line"> Compiler: CMAKE_CUDA_COMPILER-NOTFOUND</span><br><span class="line"></span><br><span class="line"> Build flags:</span><br><span class="line"></span><br><span class="line"> Id flags: -v</span><br><span class="line"></span><br><span class="line"></span><br><span class="line"></span><br><span class="line"> The output was:</span><br><span class="line"></span><br><span class="line"> No such file or directory</span><br><span class="line"></span><br><span class="line">Call Stack (most recent call first):</span><br><span class="line"> /usr/share/cmake-3.22/Modules/CMakeDetermineCompilerId.cmake:6 (CMAKE_DETERMINE_COMPILER_ID_BUILD)</span><br><span class="line"> /usr/share/cmake-3.22/Modules/CMakeDetermineCompilerId.cmake:48 (__determine_compiler_id_test)</span><br><span class="line"> /usr/share/cmake-3.22/Modules/CMakeDetermineCUDACompiler.cmake:298 (CMAKE_DETERMINE_COMPILER_ID)</span><br><span class="line"> ggml/src/ggml-cuda/CMakeLists.txt:25 (enable_language)</span><br><span class="line"></span><br><span class="line">-- Configuring incomplete, errors occurred!</span><br></pre></td></tr></table></figure><p>This happens because the CUDA 12.6 compiler path is not set correctly. Set the CUDA compiler path manually by running:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">export</span> PATH=/usr/local/cuda-12.6/bin<span class="variable">${PATH:+:<span class="variable">${PATH}</span>}</span></span><br><span class="line"><span class="built_in">export</span> LD_LIBRARY_PATH=/usr/local/cuda-12.6/lib64\</span><br><span class="line"> <span class="variable">${LD_LIBRARY_PATH:+:<span class="variable">${LD_LIBRARY_PATH}</span>}</span></span><br></pre></td></tr></table></figure><p>Then re-run the build commands above and the error is resolved.</p><h2 id="模型轉換為-GGUF-格式"><a href="#模型轉換為-GGUF-格式" class="headerlink" title="模型轉換為 GGUF 格式"></a>Converting a Model to GGUF Format</h2><p>Most models are not published in GGUF format, so before llama.cpp can run them, you must first convert them to GGUF so they can be loaded correctly.<br>Of course, plenty of already-converted GGUF models exist on Hugging Face; if you downloaded one of those, you can skip this step.</p><h3 id="下載原始模型"><a href="#下載原始模型" class="headerlink" title="下載原始模型"></a>Downloading the Original Model</h3><p>Here I use <a href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-8B">DeepSeek-R1-Distill-Llama-8B</a> as the example; feel free to substitute any model you like.</p><h4 id="使用-huggingface-cli-下載"><a href="#使用-huggingface-cli-下載" class="headerlink" title="使用 huggingface-cli 下載"></a>Downloading with huggingface-cli</h4><p>Since the download takes quite a while, I recommend using <a href="https://huggingface.co/docs/huggingface_hub/en/guides/cli">huggingface-cli</a> to fetch the model.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">huggingface-cli download deepseek-ai/DeepSeek-R1-Distill-Llama-8B --local-dir ds-r1-distill-llama-8b</span><br><span class="line"><span class="comment"># huggingface-cli download <org/repo> --local-dir <local destination path></span></span><br></pre></td></tr></table></figure><h4 id="安裝轉換模型用的相依性函式庫"><a href="#安裝轉換模型用的相依性函式庫" class="headerlink" title="安裝轉換模型用的相依性函式庫"></a>Installing the Conversion Dependencies</h4><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># run from the root of the llama.cpp repo</span></span><br><span class="line">pip install -e .</span><br></pre></td></tr></table></figure><h4 id="轉換模型"><a href="#轉換模型" class="headerlink" title="轉換模型"></a>Converting the Model</h4><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">python convert_hf_to_gguf.py --outfile ./ds-r1-distill-llama-8b ./ds-r1-distill-llama-8b</span><br><span class="line"><span class="comment"># python convert_hf_to_gguf.py --outfile <output directory> <model directory></span></span><br></pre></td></tr></table></figure><p>The execution log below is provided for reference only:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span 
class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br><span class="line">40</span><br><span class="line">41</span><br><span class="line">42</span><br><span class="line">43</span><br><span class="line">44</span><br><span class="line">45</span><br><span class="line">46</span><br><span class="line">47</span><br><span class="line">48</span><br><span class="line">49</span><br><span class="line">50</span><br></pre></td><td class="code"><pre><span class="line">INFO:hf-to-gguf:Loading model: ds-r1-distill-llama-8b</span><br><span class="line">INFO:gguf.gguf_writer:gguf: This GGUF file is <span class="keyword">for</span> Little Endian only</span><br><span class="line">INFO:hf-to-gguf:Exporting model...</span><br><span class="line">INFO:hf-to-gguf:rope_freqs.weight, torch.float32 --> F32, shape = {64}</span><br><span class="line">INFO:hf-to-gguf:gguf: loading model weight map from <span class="string">'model.safetensors.index.json'</span></span><br><span class="line">INFO:hf-to-gguf:gguf: loading model part <span class="string">'model-00001-of-000002.safetensors'</span></span><br><span class="line">INFO:hf-to-gguf:token_embd.weight, torch.bfloat16 --> F16, shape = {4096, 128256}</span><br><span class="line">INFO:hf-to-gguf:blk.0.attn_norm.weight, torch.bfloat16 --> F32, shape = {4096}</span><br><span class="line">INFO:hf-to-gguf:blk.0.ffn_down.weight, torch.bfloat16 --> F16, 
shape = {14336, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.0.ffn_gate.weight, torch.bfloat16 --> F16, shape = {4096, 14336}</span><br><span class="line">INFO:hf-to-gguf:blk.0.ffn_up.weight, torch.bfloat16 --> F16, shape = {4096, 14336}</span><br><span class="line">INFO:hf-to-gguf:blk.0.ffn_norm.weight, torch.bfloat16 --> F32, shape = {4096}</span><br><span class="line">INFO:hf-to-gguf:blk.0.attn_k.weight, torch.bfloat16 --> F16, shape = {4096, 1024}</span><br><span class="line">INFO:hf-to-gguf:blk.0.attn_output.weight, torch.bfloat16 --> F16, shape = {4096, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.0.attn_q.weight, torch.bfloat16 --> F16, shape = {4096, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.0.attn_v.weight, torch.bfloat16 --> F16, shape = {4096, 1024}</span><br><span class="line">...中間省略...</span><br><span class="line">INFO:hf-to-gguf:blk.31.attn_norm.weight, torch.bfloat16 --> F32, shape = {4096}</span><br><span class="line">INFO:hf-to-gguf:blk.31.ffn_down.weight, torch.bfloat16 --> F16, shape = {14336, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.31.ffn_gate.weight, torch.bfloat16 --> F16, shape = {4096, 14336}</span><br><span class="line">INFO:hf-to-gguf:blk.31.ffn_up.weight, torch.bfloat16 --> F16, shape = {4096, 14336}</span><br><span class="line">INFO:hf-to-gguf:blk.31.ffn_norm.weight, torch.bfloat16 --> F32, shape = {4096}</span><br><span class="line">INFO:hf-to-gguf:blk.31.attn_k.weight, torch.bfloat16 --> F16, shape = {4096, 1024}</span><br><span class="line">INFO:hf-to-gguf:blk.31.attn_output.weight, torch.bfloat16 --> F16, shape = {4096, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.31.attn_q.weight, torch.bfloat16 --> F16, shape = {4096, 4096}</span><br><span class="line">INFO:hf-to-gguf:blk.31.attn_v.weight, torch.bfloat16 --> F16, shape = {4096, 1024}</span><br><span class="line">INFO:hf-to-gguf:output_norm.weight, torch.bfloat16 --> F32, shape = {4096}</span><br><span 
class="line">INFO:hf-to-gguf:Set meta model</span><br><span class="line">INFO:hf-to-gguf:Set model parameters</span><br><span class="line">INFO:hf-to-gguf:gguf: context length = 131072</span><br><span class="line">INFO:hf-to-gguf:gguf: embedding length = 4096</span><br><span class="line">INFO:hf-to-gguf:gguf: feed forward length = 14336</span><br><span class="line">INFO:hf-to-gguf:gguf: <span class="built_in">head</span> count = 32</span><br><span class="line">INFO:hf-to-gguf:gguf: key-value <span class="built_in">head</span> count = 8</span><br><span class="line">INFO:hf-to-gguf:gguf: rope theta = 500000.0</span><br><span class="line">INFO:hf-to-gguf:gguf: rms norm epsilon = 1e-05</span><br><span class="line">INFO:hf-to-gguf:gguf: file <span class="built_in">type</span> = 1</span><br><span class="line">INFO:hf-to-gguf:Set model tokenizer</span><br><span class="line">INFO:gguf.vocab:Adding 280147 merge(s).</span><br><span class="line">INFO:gguf.vocab:Setting special token <span class="built_in">type</span> bos to 128000</span><br><span class="line">INFO:gguf.vocab:Setting special token <span class="built_in">type</span> eos to 128001</span><br><span class="line">INFO:gguf.vocab:Setting special token <span class="built_in">type</span> pad to 128001</span><br><span class="line">INFO:gguf.vocab:Setting add_bos_token to True</span><br><span class="line">INFO:gguf.vocab:Setting add_eos_token to False</span><br><span class="line">INFO:gguf.vocab:Setting chat_template to {% <span class="keyword">if</span> not add_generation_prompt is defined %}{% <span class="built_in">set</span> add_generation_prompt = <span class="literal">false</span> %}{% endif %}{% <span class="built_in">set</span> ns = namespace(is_first=<span class="literal">false</span>, is_tool=<span class="literal">false</span>, is_output_first=<span class="literal">true</span>, system_prompt=<span class="string">''</span>) %}{%- <span class="keyword">for</span> message <span class="keyword">in</span> messages 
%}{%- <span class="keyword">if</span> message[<span class="string">'role'</span>] == <span class="string">'system'</span> %}{% <span class="built_in">set</span> ns.system_prompt = message[<span class="string">'content'</span>] %}{%- endif %}{%- endfor %}{{bos_token}}{{ns.system_prompt}}{%- <span class="keyword">for</span> message <span class="keyword">in</span> messages %}{%- <span class="keyword">if</span> message[<span class="string">'role'</span>] == <span class="string">'user'</span> %}{%- <span class="built_in">set</span> ns.is_tool = <span class="literal">false</span> -%}{{<span class="string">'<|User|>'</span> + message[<span class="string">'content'</span>]}}{%- endif %}{%- <span class="keyword">if</span> message[<span class="string">'role'</span>] == <span class="string">'assistant'</span> and message[<span class="string">'content'</span>] is none %}{%- <span class="built_in">set</span> ns.is_tool = <span class="literal">false</span> -%}{%- <span class="keyword">for</span> tool <span class="keyword">in</span> message[<span class="string">'tool_calls'</span>]%}{%- <span class="keyword">if</span> not ns.is_first %}{{<span class="string">'<|Assistant|><|tool▁calls▁begin|><|tool▁call▁begin|>'</span> + tool[<span class="string">'type'</span>] + <span class="string">'<|tool▁sep|>'</span> + tool[<span class="string">'function'</span>][<span class="string">'name'</span>] + <span class="string">'\n'</span> + <span class="string">'```json'</span> + <span class="string">'\n'</span> + tool[<span class="string">'function'</span>][<span class="string">'arguments'</span>] + <span class="string">'\n'</span> + <span class="string">'```'</span> + <span class="string">'<|tool▁call▁end|>'</span>}}{%- <span class="built_in">set</span> ns.is_first = <span class="literal">true</span> -%}{%- <span class="keyword">else</span> %}{{<span class="string">'\n'</span> + <span class="string">'<|tool▁call▁begin|>'</span> + tool[<span class="string">'type'</span>] + <span 
class="string">'<|tool▁sep|>'</span> + tool[<span class="string">'function'</span>][<span class="string">'name'</span>] + <span class="string">'\n'</span> + <span class="string">'```json'</span> + <span class="string">'\n'</span> + tool[<span class="string">'function'</span>][<span class="string">'arguments'</span>] + <span class="string">'\n'</span> + <span class="string">'```'</span> + <span class="string">'<|tool▁call▁end|>'</span>}}{{<span class="string">'<|tool▁calls▁end|><|end▁of▁sentence|>'</span>}}{%- endif %}{%- endfor %}{%- endif %}{%- <span class="keyword">if</span> message[<span class="string">'role'</span>] == <span class="string">'assistant'</span> and message[<span class="string">'content'</span>] is not none %}{%- <span class="keyword">if</span> ns.is_tool %}{{<span class="string">'<|tool▁outputs▁end|>'</span> + message[<span class="string">'content'</span>] + <span class="string">'<|end▁of▁sentence|>'</span>}}{%- <span class="built_in">set</span> ns.is_tool = <span class="literal">false</span> -%}{%- <span class="keyword">else</span> %}{% <span class="built_in">set</span> content = message[<span class="string">'content'</span>] %}{% <span class="keyword">if</span> <span class="string">'</think>'</span> <span class="keyword">in</span> content %}{% <span class="built_in">set</span> content = content.split(<span class="string">'</think>'</span>)[-1] %}{% endif %}{{<span class="string">'<|Assistant|>'</span> + content + <span class="string">'<|end▁of▁sentence|>'</span>}}{%- endif %}{%- endif %}{%- <span class="keyword">if</span> message[<span class="string">'role'</span>] == <span class="string">'tool'</span> %}{%- <span class="built_in">set</span> ns.is_tool = <span class="literal">true</span> -%}{%- <span class="keyword">if</span> ns.is_output_first %}{{<span class="string">'<|tool▁outputs▁begin|><|tool▁output▁begin|>'</span> + message[<span class="string">'content'</span>] + <span class="string">'<|tool▁output▁end|>'</span>}}{%- <span 
class="built_in">set</span> ns.is_output_first = <span class="literal">false</span> %}{%- <span class="keyword">else</span> %}{{<span class="string">'\n<|tool▁output▁begin|>'</span> + message[<span class="string">'content'</span>] + <span class="string">'<|tool▁output▁end|>'</span>}}{%- endif %}{%- endif %}{%- endfor -%}{% <span class="keyword">if</span> ns.is_tool %}{{<span class="string">'<|tool▁outputs▁end|>'</span>}}{% endif %}{% <span class="keyword">if</span> add_generation_prompt and not ns.is_tool %}{{<span class="string">'<|Assistant|><think>\n'</span>}}{% endif %}</span><br><span class="line">INFO:hf-to-gguf:Set model quantization version</span><br><span class="line">INFO:gguf.gguf_writer:Writing the following files:</span><br><span class="line">INFO:gguf.gguf_writer:/disk/models/ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-F16.gguf: n_tensors = 292, total_size = 16.1G</span><br><span class="line">Writing: 100%|███████████████████████████████████████████████████████████████████████████████████| 16.1G/16.1G [01:14<00:00, 216Mbyte/s]</span><br><span class="line">INFO:hf-to-gguf:Model successfully exported to /disk/models/ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-F16.gguf</span><br></pre></td></tr></table></figure><p>After the conversion we can take a first look at the model sizes; there is essentially no significant change:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br></pre></td><td class="code"><pre><span class="line"># Model size before conversion (8.1G + 6.9G = 15G)</span><br><span class="line">8.1G model-00001-of-000002.safetensors</span><br><span class="line">6.9G model-00002-of-000002.safetensors</span><br><span class="line"># F16 GGUF after conversion</span><br><span class="line">15G ds-r1-distill-llama-8B-F16.gguf</span><br></pre></td></tr></table></figure><h2 id="模型量化"><a href="#模型量化" class="headerlink" 
title="Model Quantization"></a>Model Quantization</h2><p>For well-known reasons (everyone is broke), the full (F16) version of most models is simply too large to deploy: besides being hard to fit into consumer-grade hardware because of its size, hardware performance limits also tend to make inference speed unsatisfactory. This is where quantization comes in: although it lowers precision, it shrinks the model to a size that is much easier to deploy and yields higher inference speed, for better overall usability.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># Run from the llama.cpp root directory</span></span><br><span class="line">./build/bin/llama-quantize ./ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-F16.gguf ./ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-Q4_0.gguf q4_0</span><br><span class="line"><span class="comment"># ./build/bin/llama-quantize <f16 gguf file path> <quantized gguf output path> <quantization method></span></span><br></pre></td></tr></table></figure><p>Log for reference only:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span 
class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br><span class="line">40</span><br><span class="line">41</span><br><span class="line">42</span><br><span class="line">43</span><br><span class="line">44</span><br><span class="line">45</span><br><span class="line">46</span><br><span class="line">47</span><br><span class="line">48</span><br><span class="line">49</span><br><span class="line">50</span><br><span class="line">51</span><br><span class="line">52</span><br><span class="line">53</span><br><span class="line">54</span><br><span class="line">55</span><br><span class="line">56</span><br><span class="line">57</span><br><span class="line">58</span><br><span class="line">59</span><br><span class="line">60</span><br><span class="line">61</span><br><span class="line">62</span><br><span class="line">63</span><br><span class="line">64</span><br><span class="line">65</span><br><span class="line">66</span><br><span class="line">67</span><br><span class="line">68</span><br></pre></td><td class="code"><pre><span class="line">main: build = 4845 (d6c95b07)</span><br><span class="line">main: built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for aarch64-linux-gnu</span><br><span class="line">main: quantizing '/disk/models/ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-F16.gguf' to '/disk/models/ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-Q4_0.gguf' as Q4_0</span><br><span class="line">llama_model_loader: loaded meta data with 29 key-value pairs and 292 tensors from /disk/models/ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-F16.gguf (version GGUF V3 (latest))</span><br><span class="line">llama_model_loader: Dumping metadata keys/values. 
Note: KV overrides do not apply in this output.</span><br><span class="line">llama_model_loader: - kv 0: general.architecture str = llama</span><br><span class="line">llama_model_loader: - kv 1: general.type str = model</span><br><span class="line">llama_model_loader: - kv 2: general.name str = Ds R1 Distill Llama 8b</span><br><span class="line">llama_model_loader: - kv 3: general.basename str = ds-r1-distill-llama</span><br><span class="line">llama_model_loader: - kv 4: general.size_label str = 8B</span><br><span class="line">llama_model_loader: - kv 5: general.license str = mit</span><br><span class="line">llama_model_loader: - kv 6: llama.block_count u32 = 32</span><br><span class="line">llama_model_loader: - kv 7: llama.context_length u32 = 131072</span><br><span class="line">llama_model_loader: - kv 8: llama.embedding_length u32 = 4096</span><br><span class="line">llama_model_loader: - kv 9: llama.feed_forward_length u32 = 14336</span><br><span class="line">llama_model_loader: - kv 10: llama.attention.head_count u32 = 32</span><br><span class="line">llama_model_loader: - kv 11: llama.attention.head_count_kv u32 = 8</span><br><span class="line">llama_model_loader: - kv 12: llama.rope.freq_base f32 = 500000.000000</span><br><span class="line">llama_model_loader: - kv 13: llama.attention.layer_norm_rms_epsilon f32 = 0.000010</span><br><span class="line">llama_model_loader: - kv 14: general.file_type u32 = 1</span><br><span class="line">llama_model_loader: - kv 15: llama.vocab_size u32 = 128256</span><br><span class="line">llama_model_loader: - kv 16: llama.rope.dimension_count u32 = 128</span><br><span class="line">llama_model_loader: - kv 17: tokenizer.ggml.model str = gpt2</span><br><span class="line">llama_model_loader: - kv 18: tokenizer.ggml.pre str = llama-bpe</span><br><span class="line">llama_model_loader: - kv 19: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...</span><br><span class="line">llama_model_loader: - kv 20: 
tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...</span><br><span class="line">llama_model_loader: - kv 21: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...</span><br><span class="line">llama_model_loader: - kv 22: tokenizer.ggml.bos_token_id u32 = 128000</span><br><span class="line">llama_model_loader: - kv 23: tokenizer.ggml.eos_token_id u32 = 128001</span><br><span class="line">llama_model_loader: - kv 24: tokenizer.ggml.padding_token_id u32 = 128001</span><br><span class="line">llama_model_loader: - kv 25: tokenizer.ggml.add_bos_token bool = true</span><br><span class="line">llama_model_loader: - kv 26: tokenizer.ggml.add_eos_token bool = false</span><br><span class="line">llama_model_loader: - kv 27: tokenizer.chat_template str = {% if not add_generation_prompt is de...</span><br><span class="line">llama_model_loader: - kv 28: general.quantization_version u32 = 2</span><br><span class="line">llama_model_loader: - type f32: 66 tensors</span><br><span class="line">llama_model_loader: - type f16: 226 tensors</span><br><span class="line">ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no</span><br><span class="line">ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no</span><br><span class="line">ggml_cuda_init: found 1 CUDA devices:</span><br><span class="line"> Device 0: Orin, compute capability 8.7, VMM: yes</span><br><span class="line">[ 1/ 292] output.weight - [ 4096, 128256, 1, 1], type = f16, converting to q6_K .. size = 1002.00 MiB -> 410.98 MiB</span><br><span class="line">[ 2/ 292] output_norm.weight - [ 4096, 1, 1, 1], type = f32, size = 0.016 MB</span><br><span class="line">[ 3/ 292] rope_freqs.weight - [ 64, 1, 1, 1], type = f32, size = 0.000 MB</span><br><span class="line">[ 4/ 292] token_embd.weight - [ 4096, 128256, 1, 1], type = f16, converting to q4_0 .. size = 1002.00 MiB -> 281.81 MiB</span><br><span class="line">[ 5/ 292] blk.0.attn_k.weight - [ 4096, 1024, 1, 1], type = f16, converting to q4_0 .. 
size = 8.00 MiB -> 2.25 MiB</span><br><span class="line">[ 6/ 292] blk.0.attn_norm.weight - [ 4096, 1, 1, 1], type = f32, size = 0.016 MB</span><br><span class="line">[ 7/ 292] blk.0.attn_output.weight - [ 4096, 4096, 1, 1], type = f16, converting to q4_0 .. size = 32.00 MiB -> 9.00 MiB</span><br><span class="line">[ 8/ 292] blk.0.attn_q.weight - [ 4096, 4096, 1, 1], type = f16, converting to q4_0 .. size = 32.00 MiB -> 9.00 MiB</span><br><span class="line">[ 9/ 292] blk.0.attn_v.weight - [ 4096, 1024, 1, 1], type = f16, converting to q4_0 .. size = 8.00 MiB -> 2.25 MiB</span><br><span class="line">[ 10/ 292] blk.0.ffn_down.weight - [14336, 4096, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">[ 11/ 292] blk.0.ffn_gate.weight - [ 4096, 14336, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">[ 12/ 292] blk.0.ffn_norm.weight - [ 4096, 1, 1, 1], type = f32, size = 0.016 MB</span><br><span class="line">[ 13/ 292] blk.0.ffn_up.weight - [ 4096, 14336, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">...中間省略...</span><br><span class="line">[ 284/ 292] blk.31.attn_k.weight - [ 4096, 1024, 1, 1], type = f16, converting to q4_0 .. size = 8.00 MiB -> 2.25 MiB</span><br><span class="line">[ 285/ 292] blk.31.attn_norm.weight - [ 4096, 1, 1, 1], type = f32, size = 0.016 MB</span><br><span class="line">[ 286/ 292] blk.31.attn_output.weight - [ 4096, 4096, 1, 1], type = f16, converting to q4_0 .. size = 32.00 MiB -> 9.00 MiB</span><br><span class="line">[ 287/ 292] blk.31.attn_q.weight - [ 4096, 4096, 1, 1], type = f16, converting to q4_0 .. size = 32.00 MiB -> 9.00 MiB</span><br><span class="line">[ 288/ 292] blk.31.attn_v.weight - [ 4096, 1024, 1, 1], type = f16, converting to q4_0 .. 
size = 8.00 MiB -> 2.25 MiB</span><br><span class="line">[ 289/ 292] blk.31.ffn_down.weight - [14336, 4096, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">[ 290/ 292] blk.31.ffn_gate.weight - [ 4096, 14336, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">[ 291/ 292] blk.31.ffn_norm.weight - [ 4096, 1, 1, 1], type = f32, size = 0.016 MB</span><br><span class="line">[ 292/ 292] blk.31.ffn_up.weight - [ 4096, 14336, 1, 1], type = f16, converting to q4_0 .. size = 112.00 MiB -> 31.50 MiB</span><br><span class="line">llama_model_quantize_impl: model size = 15317.02 MB</span><br><span class="line">llama_model_quantize_impl: quant size = 4437.80 MB</span><br><span class="line"></span><br><span class="line">main: quantize time = 17884.06 ms</span><br><span class="line">main: total time = 17884.06 ms</span><br></pre></td></tr></table></figure><p>After quantization, the model sizes compare as follows:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line">Originally 15GB; 4.4GB after quantization</span><br><span class="line">llama_model_quantize_impl: model size = 15317.02 MB</span><br><span class="line">llama_model_quantize_impl: quant size = 4437.80 MB</span><br></pre></td></tr></table></figure><h2 id="部署-AI-應用"><a href="#部署-AI-應用" class="headerlink" title="Deploying AI Applications"></a>Deploying AI Applications</h2><h3 id="透過-llama-cli-進行推論"><a href="#透過-llama-cli-進行推論" class="headerlink" title="Inference with llama-cli"></a>Inference with llama-cli</h3><p>You can interact with the model directly through the chat CLI; run inference with the following command:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">./build/bin/llama-cli -m ./ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-Q4_0.gguf</span><br></pre></td></tr></table></figure><p>You will immediately notice that the GPU does not seem to be used. That's expected; you will find the following information in the 
log:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br></pre></td><td class="code"><pre><span class="line">load_tensors: loading model tensors, this can take a while... (mmap = true)</span><br><span class="line">load_tensors: offloading 0 repeating layers to GPU</span><br><span class="line">load_tensors: offloaded 0/33 layers to GPU</span><br><span class="line">load_tensors: CPU_AARCH64 model buffer size = 3744.00 MiB</span><br><span class="line">load_tensors: CPU_Mapped model buffer size = 4406.30 MiB</span><br></pre></td></tr></table></figure><p>It tells you that only the CPU was used; the model was not placed on the GPU for acceleration.</p><p>You need to pass <code>-ngl N</code> to specify how many layers to offload to the GPU, for example:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">./build/bin/llama-cli -m ./ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-Q4_0.gguf -ngl 33</span><br></pre></td></tr></table></figure><p>Now you will see the GPU being used:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br></pre></td><td class="code"><pre><span class="line">load_tensors: loading model tensors, this can take a while... 
(mmap = true)</span><br><span class="line">load_tensors: offloading 32 repeating layers to GPU</span><br><span class="line">load_tensors: offloading output layer to GPU</span><br><span class="line">load_tensors: offloaded 33/33 layers to GPU</span><br><span class="line">load_tensors: CUDA0 model buffer size = 4155.99 MiB</span><br><span class="line">load_tensors: CPU_Mapped model buffer size = 281.81 MiB</span><br></pre></td></tr></table></figure><h3 id="透過-llama-server-啟動-OpenAI-相容的-API-伺服器"><a href="#透過-llama-server-啟動-OpenAI-相容的-API-伺服器" class="headerlink" title="Starting an OpenAI-compatible API server with llama-server"></a>Starting an OpenAI-compatible API server with llama-server</h3><p>Start llama-server with the following command:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">./build/bin/llama-server -m ./ds-r1-distill-llama-8b/ds-r1-distill-llama-8B-Q4_0.gguf -ngl 33</span><br></pre></td></tr></table></figure><p>In principle, you just replace <code>cli</code> above with <code>server</code> and you're done!</p><h4 id="如何與之互動"><a href="#如何與之互動" class="headerlink" title="How to interact with it"></a>How to interact with it</h4><p>Refer to the OpenAI API documentation for usage details; you can point any tool or service that supports an OpenAI API endpoint at the server started by llama-server.</p><p>The following simply demonstrates sending a request with <code>curl</code> and rendering the output with <code>jq</code>:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span 
class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br></pre></td><td class="code"><pre><span class="line">curl -X POST http://localhost:8080/v1/chat/completions \</span><br><span class="line"> -H <span class="string">'accept:application/json'</span> \</span><br><span class="line"> -H <span class="string">'Content-Type: application/json'</span> \</span><br><span class="line"> -d <span class="string">'{"messages":[{"role":"system", "content": "You are a helpful assistant. Reply questions in less than five words."}, {"role":"user", "content": "What is the capital of Japan?"}], "model":"default"}'</span> | jq .</span><br><span class="line"></span><br><span class="line"><span class="comment"># 輸出結果</span></span><br><span class="line"></span><br><span class="line">{</span><br><span class="line"> <span class="string">"choices"</span>: [</span><br><span class="line"> {</span><br><span class="line"> <span class="string">"finish_reason"</span>: <span class="string">"stop"</span>,</span><br><span class="line"> <span class="string">"index"</span>: 0,</span><br><span class="line"> <span class="string">"message"</span>: {</span><br><span class="line"> <span class="string">"role"</span>: <span class="string">"assistant"</span>,</span><br><span class="line"> <span class="string">"content"</span>: <span class="string">"<think>\nOkay, so I need to figure out the capital of Japan. 
Hmm, I'm not entirely sure, but I think it's a country that's pretty well-known, so maybe I can recall it from memory. Let me try to think. I remember that Tokyo is a major city there. Wait, is Tokyo the capital? Or is it another city? I think I've heard that Tokyo is both the capital and the largest city. I'm pretty sure other countries have capitals that are also their biggest cities, like Washington D.C. in the U.S. So applying that logic, Japan's capital should be Tokyo. I don't think it's Osaka or Nagasaki because those are other major cities, but I'm almost certain the capital is Tokyo. Yeah, I feel confident about that.\n</think>\n\nTokyo"</span></span><br><span class="line"> }</span><br><span class="line"> }</span><br><span class="line"> ],</span><br><span class="line"> <span class="string">"created"</span>: 1741332025,</span><br><span class="line"> <span class="string">"model"</span>: <span class="string">"default"</span>,</span><br><span class="line"> <span class="string">"system_fingerprint"</span>: <span class="string">"b4845-d6c95b07"</span>,</span><br><span class="line"> <span class="string">"object"</span>: <span class="string">"chat.completion"</span>,</span><br><span class="line"> <span class="string">"usage"</span>: {</span><br><span class="line"> <span class="string">"completion_tokens"</span>: 167,</span><br><span class="line"> <span class="string">"prompt_tokens"</span>: 24,</span><br><span class="line"> <span class="string">"total_tokens"</span>: 191</span><br><span class="line"> },</span><br><span class="line"> <span class="string">"id"</span>: <span class="string">"chatcmpl-cRBfkobRrFxvmeQxxL6ZYipDB60TSmQr"</span>,</span><br><span class="line"> <span class="string">"timings"</span>: {</span><br><span class="line"> <span class="string">"prompt_n"</span>: 13,</span><br><span class="line"> <span class="string">"prompt_ms"</span>: 806.983,</span><br><span class="line"> <span class="string">"prompt_per_token_ms"</span>: 
62.07561538461538,</span><br><span class="line"> <span class="string">"prompt_per_second"</span>: 16.109385203901446,</span><br><span class="line"> <span class="string">"predicted_n"</span>: 167,</span><br><span class="line"> <span class="string">"predicted_ms"</span>: 17762.902,</span><br><span class="line"> <span class="string">"predicted_per_token_ms"</span>: 106.36468263473053,</span><br><span class="line"> <span class="string">"predicted_per_second"</span>: 9.401616920478423</span><br><span class="line"> }</span><br><span class="line">}</span><br></pre></td></tr></table></figure><h2 id="結語"><a href="#結語" class="headerlink" title="Conclusion"></a>Conclusion</h2><p>This tutorial is based on <code>llama.cpp b4845 (d6c95b07)</code>. Since llama.cpp is a very active project, the tool parameters above may have changed; if a command fails to run as shown, use <code>--help</code> or <code>-h</code> to check the latest usage.<br>I hope this article helps developers who want to use NVIDIA Jetson series developer boards get started with llama.cpp easily and deploy AI applications more quickly.</p>]]></content>
<summary type="html"><img src="/2025/03/07/llamacpp-on-jetson-orin-agx/img.jpg" class="">
<h2 id="簡介"><a href="#簡介" class="headerlink" title="Introduction"></a>Introduction</h2><p>For work, I needed to use llama.cpp as the backend for WASI-NN so that WebAssembly could gain the ability to run AI models, which meant compiling llama.cpp as a dependency library on a variety of platforms.<br>The NVIDIA Jetson Orin AGX 64GB, as a platform offering relatively large VRAM and CUDA support, was naturally the target we spent the most effort testing and optimizing on.<br>This article documents the complete workflow of compiling llama.cpp on the NVIDIA Jetson Orin AGX (JetPack 6.2), converting a large language model to GGUF format, quantizing the model, and finally deploying an AI application.</p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="Linux" scheme="https://hyd.ai/tags/Linux/"/>
<category term="LLM" scheme="https://hyd.ai/tags/LLM/"/>
<category term="NVIDIA" scheme="https://hyd.ai/tags/NVIDIA/"/>
<category term="Jetson" scheme="https://hyd.ai/tags/Jetson/"/>
<category term="Orin" scheme="https://hyd.ai/tags/Orin/"/>
<category term="AGX" scheme="https://hyd.ai/tags/AGX/"/>
<category term="JetPack" scheme="https://hyd.ai/tags/JetPack/"/>
<category term="CUDA" scheme="https://hyd.ai/tags/CUDA/"/>
<category term="GGUF" scheme="https://hyd.ai/tags/GGUF/"/>
<category term="llamacpp" scheme="https://hyd.ai/tags/llamacpp/"/>
</entry>
<entry>
<title>Deploying Flatcar Container Linux + LLM on DigitalOcean</title>
<link href="https://hyd.ai/2024/12/12/deploy-flatcar-on-do/"/>
<id>https://hyd.ai/2024/12/12/deploy-flatcar-on-do/</id>
<published>2024-12-12T14:31:17.000Z</published>
<updated>2024-12-13T10:41:57.678Z</updated>
<content type="html"><![CDATA[<h2 id="前言"><a href="#前言" class="headerlink" title="Foreword"></a>Foreword</h2><p>I previously submitted <a href="https://github.com/flatcar/sysext-bakery/pulls?q=is:pr+author:hydai+is:closed">two PRs</a> to Flatcar's system extensions so that <a href="https://github.com/WasmEdge/WasmEdge/">WasmEdge</a> and <a href="https://github.com/LlamaEdge/LlamaEdge">LlamaEdge</a> can be installed together, as sysexts, when Flatcar is deployed. After some things happened with DigitalOcean, I decided to leave these notes behind and prepare to migrate to Azure or another friendlier platform.</p><p>This article covers:</p><ol><li>How to build your own Flatcar sysext (system extension).</li><li>How to add a custom Flatcar Container Linux image on DigitalOcean.</li><li>How to deploy Flatcar Container Linux on DigitalOcean.</li><li>How to run Wasm applications (disclosure: I am a maintainer of WasmEdge).</li><li>How to deploy a large language model (LLM) on Flatcar Container Linux, using LlamaEdge as the example (disclosure: this project compiles to Wasm and runs on WasmEdge).</li></ol><span id="more"></span><h2 id="為什麼選-Flatcar-Container-Linux"><a href="#為什麼選-Flatcar-Container-Linux" class="headerlink" title="Why Flatcar Container Linux?"></a>Why Flatcar Container Linux?</h2><p>Flatcar Container Linux is the successor to CoreOS Container Linux, an operating system designed specifically for containerized applications. Its characteristics:</p><ul><li>Secure: it cannot be modified through a package manager and can only be extended via sysexts.</li><li>Optimized for containers: by default it ships only the packages related to running containers, reducing the attack surface that redundant packages would add.</li><li>Convenient initialization: initial setup is done through <a href="https://coreos.github.io/ignition/">Ignition</a>, letting users complete their desired configuration at boot time, including but not limited to: starting services, creating files in the root filesystem, reformatting the <code>/var</code> filesystem, replacing configuration files with remote ones, adding users, and modifying kernel parameters.</li></ul><p>My friend H said, "That was way too official-sounding. Can you be honest?"<br>Me: "Honestly, I want to contribute the WasmEdge sysext to the Flatcar ecosystem, so of course I had to pick it<code>ψ(`∇´)ψ</code>"</p><h2 id="建立你的-Flatcar-系統擴充元件-Sysext"><a href="#建立你的-Flatcar-系統擴充元件-Sysext" class="headerlink" title="Building Your Flatcar System Extension (Sysext)"></a>Building Your Flatcar System Extension (Sysext)</h2><p>Flatcar officially provides a tool, <a href="https://github.com/flatcar/sysext-bakery">sysext-bakery</a>, that lets users build their own sysexts.<br>It comes with many examples: WasmEdge, Wasmtime, containerd, crio, kubernetes, nvidia-runtime, and more. You can pick the one you like, or a similar one, as a template to modify.</p><h3 id="以-WasmEdge-為例"><a href="#以-WasmEdge-為例" class="headerlink" title="以 WasmEdge 為例"></a>以 WasmEdge
為例</h3><p>整個腳本可以分為以下幾個部分:</p><ol><li>設定 bash 的環境變數。</li><li>檢查參數是否正確,如果不正確就印出使用說明。</li><li>下載與安裝你需要的應用程式。在這個範例中,我們下載了 WasmEdge 的壓縮檔。</li><li>移動檔案到正確的位置,比如執行檔應放置於 <code>/usr/bin/</code> 下,函式庫應放置於 <code>/usr/lib/</code> 下等 。</li><li>執行 <code>bake.sh</code> 來建立 sysext。</li></ol><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br><span class="line">40</span><br><span class="line">41</span><br><span class="line">42</span><br><span class="line">43</span><br><span class="line">44</span><br><span class="line">45</span><br><span class="line">46</span><br><span class="line">47</span><br><span class="line">48</span><br><span class="line">49</span><br><span class="line">50</span><br><span class="line">51</span><br><span 
class="line">52</span><br><span class="line">53</span><br><span class="line">54</span><br></pre></td><td class="code"><pre><span class="line"><span class="meta">#!/usr/bin/env bash</span></span><br><span class="line"><span class="built_in">set</span> -euo pipefail</span><br><span class="line"></span><br><span class="line"><span class="comment"># Set up environment variables</span></span><br><span class="line"><span class="built_in">export</span> ARCH=<span class="string">"<span class="variable">${ARCH-x86-64}</span>"</span></span><br><span class="line">SCRIPTFOLDER=<span class="string">"<span class="subst">$(dirname <span class="string">"<span class="subst">$(readlink -f <span class="string">"<span class="variable">$0</span>"</span>)</span>"</span>)</span>"</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Check the arguments</span></span><br><span class="line"><span class="keyword">if</span> [ <span class="variable">$#</span> -lt 2 ] || [ <span class="string">"<span class="variable">$1</span>"</span> = <span class="string">"-h"</span> ] || [ <span class="string">"<span class="variable">$1</span>"</span> = <span class="string">"--help"</span> ]; <span class="keyword">then</span></span><br><span class="line"> <span class="comment"># Print the usage message</span></span><br><span class="line"> <span class="built_in">echo</span> <span class="string">"Usage: <span class="variable">$0</span> VERSION SYSEXTNAME"</span></span><br><span class="line"> <span class="built_in">echo</span> <span class="string">"The script will download the WasmEdge release tar ball (e.g., for 0.14.1) and create a sysext squashfs image with the name SYSEXTNAME.raw in the current folder."</span></span><br><span class="line"> <span class="built_in">echo</span> <span class="string">"A temporary directory named SYSEXTNAME in the current folder will be created and deleted again."</span></span><br><span class="line"> <span class="built_in">echo</span> <span class="string">"All files in the sysext image will be owned by root."</span></span><br><span class="line"> <span class="built_in">echo</span> <span class="string">"To use arm64 pass 'ARCH=arm64' as environment variable (current value is '<span class="variable">${ARCH}</span>')."</span></span><br><span class="line"> <span class="string">"<span class="variable">${SCRIPTFOLDER}</span>"</span>/bake.sh --<span class="built_in">help</span></span><br><span class="line"> <span class="built_in">exit</span> 1</span><br><span class="line"><span class="keyword">fi</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Get the version and sysext name from the arguments</span></span><br><span class="line">VERSION=<span class="string">"<span class="variable">$1</span>"</span></span><br><span class="line">SYSEXTNAME=<span class="string">"<span class="variable">$2</span>"</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># The GitHub release uses different architecture identifiers, so we convert them here</span></span><br><span class="line"><span class="comment"># amd64 or x86-64 both become x86_64</span></span><br><span class="line"><span class="comment"># arm64 becomes aarch64</span></span><br><span class="line"><span class="comment"># The github release uses different arch identifiers, we map them here</span></span><br><span class="line"><span class="comment"># and rely on bake.sh to map them back to what systemd expects</span></span><br><span class="line"><span class="keyword">if</span> [ <span class="string">"<span class="variable">${ARCH}</span>"</span> = <span class="string">"amd64"</span> ] || [ <span class="string">"<span class="variable">${ARCH}</span>"</span> = <span class="string">"x86-64"</span> ]; <span class="keyword">then</span></span><br><span class="line"> ARCH=<span class="string">"x86_64"</span></span><br><span class="line"><span class="keyword">elif</span> [ <span class="string">"<span class="variable">${ARCH}</span>"</span> = <span class="string">"arm64"</span> ]; <span class="keyword">then</span></span><br><span 
class="line"> ARCH=<span class="string">"aarch64"</span></span><br><span class="line"><span class="keyword">fi</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Download and extract WasmEdge</span></span><br><span class="line"><span class="built_in">rm</span> -f <span class="string">"WasmEdge-<span class="variable">${VERSION}</span>.tar.gz"</span></span><br><span class="line">curl -o <span class="string">"WasmEdge-<span class="variable">${VERSION}</span>.tar.gz"</span> -fsSL <span class="string">"https://github.com/WasmEdge/WasmEdge/releases/download/<span class="variable">${VERSION}</span>/WasmEdge-<span class="variable">${VERSION}</span>-ubuntu20.04_<span class="variable">${ARCH}</span>.tar.gz"</span></span><br><span class="line"><span class="built_in">rm</span> -rf <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span></span><br><span class="line"><span class="built_in">mkdir</span> -p <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span></span><br><span class="line">tar --force-local -xvf <span class="string">"WasmEdge-<span class="variable">${VERSION}</span>.tar.gz"</span> -C <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span></span><br><span class="line"><span class="built_in">rm</span> <span class="string">"WasmEdge-<span class="variable">${VERSION}</span>.tar.gz"</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Move the files into place</span></span><br><span class="line"><span class="built_in">mkdir</span> -p <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/usr/bin</span><br><span class="line"><span class="built_in">mkdir</span> -p <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/usr/lib <span class="comment"># for .so files</span></span><br><span class="line"><span class="built_in">mv</span> <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/<span 
class="string">"WasmEdge-<span class="variable">${VERSION}</span>-Linux"</span>/bin/wasmedge <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/usr/bin/</span><br><span class="line"><span class="built_in">mv</span> <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/<span class="string">"WasmEdge-<span class="variable">${VERSION}</span>-Linux"</span>/lib/* <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/usr/lib/</span><br><span class="line"><span class="built_in">rm</span> -r <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span>/<span class="string">"WasmEdge-<span class="variable">${VERSION}</span>-Linux"</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Run bake.sh to build the sysext</span></span><br><span class="line"><span class="string">"<span class="variable">${SCRIPTFOLDER}</span>"</span>/bake.sh <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># Clean up the temporary files</span></span><br><span class="line"><span class="built_in">rm</span> -rf <span class="string">"<span class="variable">${SYSEXTNAME}</span>"</span><br></pre></td></tr></table></figure><h3 id="執行腳本"><a href="#執行腳本" class="headerlink" title="執行腳本"></a>Run the script</h3><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span 
class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br><span class="line">40</span><br><span class="line">41</span><br><span class="line">42</span><br><span class="line">43</span><br><span class="line">44</span><br><span class="line">45</span><br><span class="line">46</span><br><span class="line">47</span><br><span class="line">48</span><br></pre></td><td class="code"><pre><span class="line">./create_wasmedge_sysext.sh 0.14.1 wasmedge</span><br><span class="line"></span><br><span class="line"><span class="comment"># Output</span></span><br><span class="line">WasmEdge-0.14.1-Linux/lib/</span><br><span class="line">WasmEdge-0.14.1-Linux/lib/libwasmedge.so.0</span><br><span class="line">WasmEdge-0.14.1-Linux/lib/libwasmedge.so.0.1.0</span><br><span class="line">WasmEdge-0.14.1-Linux/lib/libwasmedge.so</span><br><span class="line">WasmEdge-0.14.1-Linux/bin/</span><br><span class="line">WasmEdge-0.14.1-Linux/bin/wasmedge</span><br><span class="line">WasmEdge-0.14.1-Linux/bin/wasmedgec</span><br><span class="line">WasmEdge-0.14.1-Linux/include/</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/version.h</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/enum_types.h</span><br><span 
class="line">WasmEdge-0.14.1-Linux/include/wasmedge/enum.inc</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/enum_errcode.h</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/wasmedge.h</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/enum_configure.h</span><br><span class="line">WasmEdge-0.14.1-Linux/include/wasmedge/int128.h</span><br><span class="line">Parallel mksquashfs: Using 8 processors</span><br><span class="line">Creating 4.0 filesystem on wasmedge.raw, block size 131072.</span><br><span class="line">[==================================================\] 654/654 100%</span><br><span class="line"></span><br><span class="line">Exportable Squashfs 4.0 filesystem, gzip compressed, data block size 131072</span><br><span class="line">compressed data, compressed metadata, compressed fragments,</span><br><span class="line">compressed xattrs, compressed ids</span><br><span class="line">duplicates are removed</span><br><span class="line">Filesystem size 25975.38 Kbytes (25.37 Mbytes)</span><br><span class="line">31.16% of uncompressed filesystem size (83364.33 Kbytes)</span><br><span class="line">Inode table size 1930 bytes (1.88 Kbytes)</span><br><span class="line">65.42% of uncompressed inode table size (2950 bytes)</span><br><span class="line">Directory table size 140 bytes (0.14 Kbytes)</span><br><span class="line">56.91% of uncompressed directory table size (246 bytes)</span><br><span class="line">Number of duplicate files found 0</span><br><span class="line">Number of inodes 10</span><br><span class="line">Number of files 3</span><br><span class="line">Number of fragments 1</span><br><span class="line">Number of symbolic links 2</span><br><span class="line">Number of device nodes 0</span><br><span class="line">Number of fifo nodes 0</span><br><span class="line">Number of socket nodes 0</span><br><span class="line">Number of directories 5</span><br><span class="line">Number of ids (unique uids + gids) 
1</span><br><span class="line">Number of uids 1</span><br><span class="line">root (0)</span><br><span class="line">Number of gids 1</span><br><span class="line">root (0)</span><br><span class="line">Created wasmedge.raw</span><br></pre></td></tr></table></figure><p>This produces a sysext image named <code>wasmedge.raw</code>. Put it somewhere reachable over the network; we will need it later during deployment.</p><hr><h2 id="在-DigitalOcean-上新增-Flatcar-Container-Linux-的客製化映像檔案"><a href="#在-DigitalOcean-上新增-Flatcar-Container-Linux-的客製化映像檔案" class="headerlink" title="在 DigitalOcean 上新增 Flatcar Container Linux 的客製化映像檔案"></a>Add a custom Flatcar Container Linux image on DigitalOcean</h2><p>DigitalOcean does not provide a Flatcar Container Linux image, so we need to create a custom one ourselves.<br>The steps here follow <a href="https://www.flatcar.org/docs/latest/installing/cloud/digitalocean/">the official Flatcar documentation</a>.</p><h3 id="下載你想部署的對應版本映像檔"><a href="#下載你想部署的對應版本映像檔" class="headerlink" title="下載你想部署的對應版本映像檔"></a>Download the image for the version you want to deploy</h3><p>The official Flatcar image URL is <code>https://<channel>.release.flatcar-linux.net/<arch>-usr/<version>/flatcar_production_digitalocean_image.bin.bz2</code>, where:</p><ul><li><code><arch></code> is <code>amd64</code> or <code>arm64</code></li><li><code><channel></code> is <code>stable</code>, <code>beta</code>, <code>alpha</code>, or <code>lts</code></li><li><code><version></code> is the version number you want to deploy.</li></ul><p>For example, to download the <code>4081.2.0</code> image for the <code>amd64</code> architecture from the stable channel:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">wget https://stable.release.flatcar-linux.net/amd64-usr/4081.2.0/flatcar_production_digitalocean_image.bin.bz2</span><br></pre></td></tr></table></figure><h3 id="上傳映像檔到-DigitalOcean"><a href="#上傳映像檔到-DigitalOcean" class="headerlink" title="上傳映像檔到 DigitalOcean"></a>Upload the image to DigitalOcean</h3><p>The official docs say you can upload the image from the command line, but in my attempts it kept hanging; uploading through the web UI is more reliable.</p><h4 id="進入-Backups-Snapshots-頁面"><a href="#進入-Backups-Snapshots-頁面" class="headerlink" title="進入 Backups & Snapshots 頁面"></a>Go to the 
Backups & Snapshots page</h4><p>The management page for custom images is tucked away under Backups & Snapshots; click the Custom Images tab.</p><img src="/2024/12/12/deploy-flatcar-on-do/do-0-custom-image.png" class=""><h4 id="上傳映像檔"><a href="#上傳映像檔" class="headerlink" title="上傳映像檔"></a>Upload the image</h4><p>Click the Upload Image button and select the file you downloaded; the following form pops up. Fill in the relevant details.</p><img src="/2024/12/12/deploy-flatcar-on-do/do-1-upload-image-box.png" class=""><h4 id="上傳中"><a href="#上傳中" class="headerlink" title="上傳中"></a>Uploading</h4><p>The upload is in progress; please be patient.</p><img src="/2024/12/12/deploy-flatcar-on-do/do-2-uploading-image.png" class=""><h4 id="上傳完成後,會進入等待中"><a href="#上傳完成後,會進入等待中" class="headerlink" title="上傳完成後,會進入等待中"></a>Once the upload finishes, the image enters a pending state</h4><img src="/2024/12/12/deploy-flatcar-on-do/do-3-image-pending.png" class=""><h4 id="等處理完成,會出現以下畫面"><a href="#等處理完成,會出現以下畫面" class="headerlink" title="等處理完成,會出現以下畫面"></a>When processing completes, you will see the following screen</h4><p>As long as the image is available in at least one region, you can proceed with the next steps.</p><img src="/2024/12/12/deploy-flatcar-on-do/do-4-image-ready.png" class=""><hr><h2 id="在-DigitalOcean-上部署-Flatcar-Container-Linux"><a href="#在-DigitalOcean-上部署-Flatcar-Container-Linux" class="headerlink" title="在 DigitalOcean 上部署 Flatcar Container Linux"></a>Deploy Flatcar Container Linux on DigitalOcean</h2><p>Because the web UI is not intuitive for this, the remaining steps use the DigitalOcean API from the command line.</p><h3 id="前置動作"><a href="#前置動作" class="headerlink" title="前置動作"></a>Prerequisites</h3><p>Before deploying, a few environment variables need to be obtained and set.</p><h4 id="取得-Personal-Access-Token"><a href="#取得-Personal-Access-Token" class="headerlink" title="取得 Personal Access Token"></a>Get a Personal Access Token</h4><p>Go to the <a href="https://cloud.digitalocean.com/account/api/tokens">Personal Access Token</a> page</p><img src="/2024/12/12/deploy-flatcar-on-do/do-5-pat-page.png" class=""><p>Click the Generate New Token button</p><img src="/2024/12/12/deploy-flatcar-on-do/do-6-pat.png" class=""><p>Choose which scopes to grant the token. Forgive my laziness: I picked Full Access here and revoked it as soon as I finished this post.</p><h4 id="將-Personal-Access-Token-設定到環境變數"><a href="#將-Personal-Access-Token-設定到環境變數" class="headerlink" title="將 Personal Access Token 設定到環境變數"></a>Put the Personal Access Token 
into an environment variable</h4><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">read</span> TOKEN</span><br></pre></td></tr></table></figure><h4 id="設定-SSH-Key-ID"><a href="#設定-SSH-Key-ID" class="headerlink" title="設定 SSH Key ID"></a>Set the SSH Key ID</h4><p>First, upload your SSH key through the website or via the API; the official documentation covers this in detail, so let me skip the upload step.</p><p>Next, you need the ID of the uploaded SSH key; here we fetch it via the API.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">curl --request GET <span class="string">"https://api.digitalocean.com/v2/account/keys"</span> \</span><br><span class="line"> --header <span class="string">"Authorization: Bearer <span class="variable">$TOKEN</span>"</span></span><br></pre></td></tr></table></figure><h5 id="SSH-Key-ID-在這邊"><a href="#SSH-Key-ID-在這邊" class="headerlink" title="SSH Key ID 在這邊"></a>The SSH Key ID is here</h5><p>Running the command above returns the output below; the number in the <code>id</code> field is the SSH Key ID we will use later.</p><figure class="highlight json"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br></pre></td><td class="code"><pre><span class="line"><span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"ssh_keys"</span><span class="punctuation">:</span><span class="punctuation">[</span></span><br><span class="line"> <span 
class="punctuation">{</span></span><br><span class="line"> <span class="attr">"id"</span><span class="punctuation">:</span><span class="number">12345678</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"public_key"</span><span class="punctuation">:</span><span class="string">"ssh-ed25519 ..."</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"name"</span><span class="punctuation">:</span><span class="string">"..."</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"fingerprint"</span><span class="punctuation">:</span><span class="string">"..."</span></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"> <span class="punctuation">]</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"links"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"pages"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"> <span class="punctuation">}</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"meta"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"total"</span><span class="punctuation">:</span><span class="number">1</span></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"><span class="punctuation">}</span></span><br></pre></td></tr></table></figure><h4 id="將-SSH-Key-ID-設定到環境變數"><a href="#將-SSH-Key-ID-設定到環境變數" class="headerlink" title="將 SSH Key ID 設定到環境變數"></a>將 SSH Key ID 設定到環境變數</h4><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td 
class="code"><pre><span class="line"><span class="built_in">read</span> SSH_KEY_ID</span><br></pre></td></tr></table></figure><h4 id="取得映像檔-ID"><a href="#取得映像檔-ID" class="headerlink" title="取得映像檔 ID"></a>Get the image ID</h4><p>If you entered a tag when uploading the image earlier, you can fetch the image ID with the following command.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">curl --request GET <span class="string">"https://api.digitalocean.com/v2/images?type=distribution&private=true&tag_name=flatcar"</span> \</span><br><span class="line"> --header <span class="string">"Authorization: Bearer <span class="variable">$TOKEN</span>"</span></span><br></pre></td></tr></table></figure><h5 id="映像檔-ID-在這邊"><a href="#映像檔-ID-在這邊" class="headerlink" title="映像檔 ID 在這邊"></a>The image ID is here</h5><p>The <code>id</code> field inside is the information we need.</p><figure class="highlight json"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br></pre></td><td class="code"><pre><span class="line"><span 
class="punctuation">{</span></span><br><span class="line"> <span class="attr">"images"</span><span class="punctuation">:</span><span class="punctuation">[</span></span><br><span class="line"> <span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"id"</span><span class="punctuation">:</span><span class="number">123456789</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"name"</span><span class="punctuation">:</span><span class="string">"flatcar_production_digitalocean_image.bin.bz2"</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"distribution"</span><span class="punctuation">:</span><span class="string">"Unknown OS"</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"slug"</span><span class="punctuation">:</span><span class="literal"><span class="keyword">null</span></span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"public"</span><span class="punctuation">:</span><span class="literal"><span class="keyword">false</span></span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"regions"</span><span class="punctuation">:</span><span class="punctuation">[</span></span><br><span class="line"> <span class="string">"nyc2"</span></span><br><span class="line"> <span class="punctuation">]</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"created_at"</span><span class="punctuation">:</span><span class="string">"2024-11-26T03:52:48Z"</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"min_disk_size"</span><span class="punctuation">:</span><span class="number">7</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"type"</span><span class="punctuation">:</span><span class="string">"custom"</span><span 
class="punctuation">,</span></span><br><span class="line"> <span class="attr">"size_gigabytes"</span><span class="punctuation">:</span><span class="number">0.66</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"description"</span><span class="punctuation">:</span><span class="string">""</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"tags"</span><span class="punctuation">:</span><span class="punctuation">[</span></span><br><span class="line"> <span class="string">"flatcar"</span></span><br><span class="line"> <span class="punctuation">]</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"status"</span><span class="punctuation">:</span><span class="string">"available"</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"error_message"</span><span class="punctuation">:</span><span class="string">""</span></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"> <span class="punctuation">]</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"links"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"pages"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"> <span class="punctuation">}</span><span class="punctuation">,</span></span><br><span class="line"> <span class="attr">"meta"</span><span class="punctuation">:</span><span class="punctuation">{</span></span><br><span class="line"> <span class="attr">"total"</span><span class="punctuation">:</span><span class="number">1</span></span><br><span class="line"> <span class="punctuation">}</span></span><br><span class="line"><span 
class="punctuation">}</span></span><br></pre></td></tr></table></figure><h4 id="將映像檔-ID-設定到環境變數"><a href="#將映像檔-ID-設定到環境變數" class="headerlink" title="將映像檔 ID 設定到環境變數"></a>Put the image ID into an environment variable</h4><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">read</span> IMAGE_ID</span><br></pre></td></tr></table></figure><hr><h3 id="建立-Droplet"><a href="#建立-Droplet" class="headerlink" title="建立 Droplet"></a>Create the Droplet</h3><h4 id="準備設定檔案"><a href="#準備設定檔案" class="headerlink" title="準備設定檔案"></a>Prepare the configuration file</h4><p>Note that Flatcar Linux is immutable, so initial configuration must be done through Ignition right at deployment time.</p><h5 id="WasmEdge-的範例"><a href="#WasmEdge-的範例" class="headerlink" title="WasmEdge 的範例"></a>WasmEdge example</h5><h6 id="Ignition-的設定檔案"><a href="#Ignition-的設定檔案" class="headerlink" title="Ignition 的設定檔案"></a>Ignition configuration file</h6><p>Essentially, this places wasmedge.raw under <code>/opt/extensions/</code> and creates a link to it at <code>/etc/extensions/wasmedge.raw</code>.</p><p>Save the following configuration as <code>wasmedge.yaml</code>.</p><figure class="highlight yaml"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br></pre></td><td class="code"><pre><span class="line"><span class="attr">variant:</span> <span class="string">flatcar</span></span><br><span class="line"><span class="attr">version:</span> <span class="number">1.0</span><span class="number">.0</span></span><br><span class="line"><span class="attr">storage:</span></span><br><span class="line"> <span class="attr">files:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">path:</span> <span 
class="string">/opt/extensions/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="attr">mode:</span> <span class="number">0420</span></span><br><span class="line"> <span class="attr">contents:</span></span><br><span class="line"> <span class="attr">source:</span> <span class="string">https://github.com/second-state/flatcar-sysext-bakery/releases/download/0.0.1/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="attr">links:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">target:</span> <span class="string">/opt/extensions/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="attr">path:</span> <span class="string">/etc/extensions/wasmedge.raw</span></span><br><span class="line"> <span class="attr">hard:</span> <span class="literal">false</span></span><br></pre></td></tr></table></figure><h6 id="轉換成-JSON-格式"><a href="#轉換成-JSON-格式" class="headerlink" title="轉換成 JSON 格式"></a>Convert to JSON</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">cat</span> wasmedge.yaml | docker run --<span class="built_in">rm</span> -i quay.io/coreos/butane:latest > wasmedge.json</span><br></pre></td></tr></table></figure><h4 id="部署-Droplet"><a href="#部署-Droplet" class="headerlink" title="部署 Droplet"></a>Deploy the Droplet</h4><p>I picked the relatively cheap <code>s-1vcpu-1gb</code> size; choose a machine that fits your needs.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br></pre></td><td class="code"><pre><span 
class="line">curl --request POST <span class="string">"https://api.digitalocean.com/v2/droplets"</span> \</span><br><span class="line"> --header <span class="string">"Content-Type: application/json"</span> \</span><br><span class="line"> --header <span class="string">"Authorization: Bearer <span class="variable">$TOKEN</span>"</span> \</span><br><span class="line"> --data <span class="string">'{</span></span><br><span class="line"><span class="string"> "region":"nyc3",</span></span><br><span class="line"><span class="string"> "image":"'</span><span class="variable">$IMAGE_ID</span><span class="string">'",</span></span><br><span class="line"><span class="string"> "size":"s-1vcpu-1gb",</span></span><br><span class="line"><span class="string"> "name":"core-5",</span></span><br><span class="line"><span class="string"> "private_networking":true,</span></span><br><span class="line"><span class="string"> "ssh_keys":['</span><span class="variable">$SSH_KEY_ID</span><span class="string">'],</span></span><br><span class="line"><span class="string"> "user_data": "'</span><span class="string">"<span class="subst">$(cat wasmedge.json | sed 's/<span class="string">"/\\"</span>/g')</span>"</span><span class="string">'"</span></span><br><span class="line"><span class="string">}'</span></span><br></pre></td></tr></table></figure><h5 id="登入-Droplet"><a href="#登入-Droplet" class="headerlink" title="登入 Droplet"></a>Log in to the Droplet</h5><p>On the Droplet page you can now see the freshly deployed Droplet hard at work.</p><img src="/2024/12/12/deploy-flatcar-on-do/do-droplet-0.png" class=""><p>Connect to the Droplet over SSH.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">ssh core@<Droplet IP></span><br><span class="line"><span class="comment"># In this example: ssh core@143.198.9.81</span></span><br></pre></td></tr></table></figure><h6 id="檢查是否有安裝成功"><a href="#檢查是否有安裝成功" class="headerlink" 
title="檢查是否有安裝成功"></a>Verify the installation</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br></pre></td><td class="code"><pre><span class="line">Flatcar Container Linux by Kinvolk stable 4081.2.0 <span class="keyword">for</span> DigitalOcean</span><br><span class="line">core@core-5 ~ $ wasmedge --version</span><br><span class="line">wasmedge version 0.14.1</span><br><span class="line"> (plugin <span class="string">"wasi_logging"</span>) version 0.1.0.0</span><br></pre></td></tr></table></figure><h6 id="執行-Wasm-應用程式"><a href="#執行-Wasm-應用程式" class="headerlink" title="執行 Wasm 應用程式"></a>Run a Wasm application</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">wget https://github.com/second-state/flatcar-sysext-bakery/releases/download/0.0.3/hello_world.wasm</span><br><span class="line">wasmedge hello_world.wasm</span><br></pre></td></tr></table></figure><p>Expected output:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">Greeting from WasmEdge (=゚ω゚)ノ</span><br></pre></td></tr></table></figure><p>And that completes the deployment!</p><hr><h5 id="LlamaEdge-的範例"><a href="#LlamaEdge-的範例" class="headerlink" title="LlamaEdge 的範例"></a>LlamaEdge example</h5><h6 id="Ignition-的設定檔案-1"><a href="#Ignition-的設定檔案-1" class="headerlink" title="Ignition 的設定檔案"></a>Ignition configuration file</h6><p>LlamaEdge runs on top of WasmEdge, so as in the previous example, we first set up WasmEdge.</p><p>Place wasmedge.raw under <code>/opt/extensions/</code> and create a link to it at <code>/etc/extensions/wasmedge.raw</code>.<br>Then place llamaedge.raw under <code>/opt/extensions/</code> and create a link to it at <code>/etc/extensions/llamaedge.raw</code>.</p><p>Save the following configuration as <code>llamaedge.yaml</code>.</p><figure class="highlight yaml"><table><tr><td 
class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br></pre></td><td class="code"><pre><span class="line"><span class="attr">variant:</span> <span class="string">flatcar</span></span><br><span class="line"><span class="attr">version:</span> <span class="number">1.0</span><span class="number">.0</span></span><br><span class="line"><span class="attr">storage:</span></span><br><span class="line"> <span class="attr">files:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">path:</span> <span class="string">/opt/extensions/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="attr">mode:</span> <span class="number">0420</span></span><br><span class="line"> <span class="attr">contents:</span></span><br><span class="line"> <span class="attr">source:</span> <span class="string">https://github.com/second-state/flatcar-sysext-bakery/releases/download/0.0.3/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">path:</span> <span class="string">/opt/extensions/llamaedge-0.14.16-x86-64.raw</span></span><br><span class="line"> <span class="attr">mode:</span> <span class="number">0420</span></span><br><span class="line"> <span class="attr">contents:</span></span><br><span class="line"> <span class="attr">source:</span> <span 
class="string">https://github.com/second-state/flatcar-sysext-bakery/releases/download/0.0.3/llamaedge-0.14.16-x86-64.raw</span></span><br><span class="line"> <span class="attr">links:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">target:</span> <span class="string">/opt/extensions/llamaedge-0.14.16-x86-64.raw</span></span><br><span class="line"> <span class="attr">path:</span> <span class="string">/etc/extensions/llamaedge.raw</span></span><br><span class="line"> <span class="attr">hard:</span> <span class="literal">false</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">target:</span> <span class="string">/opt/extensions/wasmedge-0.14.1-x86-64.raw</span></span><br><span class="line"> <span class="attr">path:</span> <span class="string">/etc/extensions/wasmedge.raw</span></span><br><span class="line"> <span class="attr">hard:</span> <span class="literal">false</span></span><br></pre></td></tr></table></figure><h6 id="轉換成-JSON-格式-1"><a href="#轉換成-JSON-格式-1" class="headerlink" title="轉換成 JSON 格式"></a>轉換成 JSON 格式</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">cat</span> llamaedge.yaml | docker run --<span class="built_in">rm</span> -i quay.io/coreos/butane:latest > llamaedge.json</span><br></pre></td></tr></table></figure><h4 id="部署-Droplet-1"><a href="#部署-Droplet-1" class="headerlink" title="部署 Droplet"></a>部署 Droplet</h4><p>我選了相對便宜的機器 <code>s-1vcpu-1gb</code>,你可以根據需求選擇適合的機器。</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span 
class="line">11</span><br><span class="line">12</span><br></pre></td><td class="code"><pre><span class="line">curl --request POST <span class="string">"https://api.digitalocean.com/v2/droplets"</span> \</span><br><span class="line"> --header <span class="string">"Content-Type: application/json"</span> \</span><br><span class="line"> --header <span class="string">"Authorization: Bearer <span class="variable">$TOKEN</span>"</span> \</span><br><span class="line"> --data <span class="string">'{</span></span><br><span class="line"><span class="string"> "region":"nyc3",</span></span><br><span class="line"><span class="string"> "image":"'</span><span class="variable">$IMAGE_ID</span><span class="string">'",</span></span><br><span class="line"><span class="string"> "size":"s-1vcpu-1gb",</span></span><br><span class="line"><span class="string"> "name":"core-5",</span></span><br><span class="line"><span class="string"> "private_networking":true,</span></span><br><span class="line"><span class="string"> "ssh_keys":['</span><span class="variable">$SSH_KEY_ID</span><span class="string">'],</span></span><br><span class="line"><span class="string"> "user_data": "'</span><span class="string">"<span class="subst">$(cat llamaedge.json | sed 's/<span class="string">"/\\"</span>/g')</span>"</span><span class="string">'"</span></span><br><span class="line"><span class="string">}'</span></span><br></pre></td></tr></table></figure><h5 id="登入-Droplet-1"><a href="#登入-Droplet-1" class="headerlink" title="登入 Droplet"></a>登入 Droplet</h5><p>使用 SSH 連線到 Droplet。</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">ssh core@<Droplet IP></span><br><span class="line"><span class="comment"># 在本範例中 ssh core@45.55.53.176</span></span><br></pre></td></tr></table></figure><h6 id="下載模型"><a href="#下載模型" class="headerlink" title="下載模型"></a>下載模型</h6><figure class="highlight 
bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">wget https://huggingface.co/second-state/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q2_K.gguf</span><br></pre></td></tr></table></figure><h6 id="啟動-LlamaEdge-API-Server"><a href="#啟動-LlamaEdge-API-Server" class="headerlink" title="啟動 LlamaEdge API Server"></a>啟動 LlamaEdge API Server</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br></pre></td><td class="code"><pre><span class="line">MODEL_FILE=<span class="string">"Llama-3.2-1B-Instruct-Q2_K.gguf"</span></span><br><span class="line">API_SERVER_WASM=<span class="string">"/usr/lib/wasmedge/wasm/llama-api-server.wasm"</span></span><br><span class="line">PROMPT_TEMPLATE=<span class="string">"llama-3-chat"</span></span><br><span class="line">CONTEXT_SIZE=128</span><br><span class="line">MODEL_NAME=<span class="string">"llama-3.2-1B"</span></span><br><span class="line"></span><br><span class="line"><span class="built_in">nohup</span> wasmedge \</span><br><span class="line"> --<span class="built_in">dir</span> .:. 
\</span><br><span class="line"> --nn-preload default:GGML:AUTO:<span class="variable">${MODEL_FILE}</span> \</span><br><span class="line"> <span class="variable">${API_SERVER_WASM}</span> \</span><br><span class="line"> --prompt-template <span class="variable">${PROMPT_TEMPLATE}</span> \</span><br><span class="line"> --ctx-size <span class="variable">${CONTEXT_SIZE}</span> \</span><br><span class="line"> --model-name <span class="variable">${MODEL_NAME}</span> &</span><br></pre></td></tr></table></figure><h6 id="送-Request-給-LlamaEdge-API-Server"><a href="#送-Request-給-LlamaEdge-API-Server" class="headerlink" title="送 Request 給 LlamaEdge API Server"></a>送 Request 給 LlamaEdge API Server</h6><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br></pre></td><td class="code"><pre><span class="line">curl -X POST http://localhost:8080/v1/chat/completions \</span><br><span class="line"> -H <span class="string">'accept:application/json'</span> \</span><br><span class="line"> -H <span class="string">'Content-Type: application/json'</span> \</span><br><span class="line"> -d <span class="string">'{"messages":[{"role":"system", "content": "You are a helpful assistant. 
Reply in short sentence"}, {"role":"user", "content": "What is the capital of Japan?"}], "model":"llama-3.2-1B"}'</span></span><br></pre></td></tr></table></figure><p>預期的輸出:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br></pre></td><td class="code"><pre><span class="line">{</span><br><span class="line"> <span class="string">"id"</span>:<span class="string">"chatcmpl-4a5a6487-3a47-4a79-86aa-0dd381dd71f7"</span>,</span><br><span class="line"> <span class="string">"object"</span>:<span class="string">"chat.completion"</span>,</span><br><span class="line"> <span class="string">"created"</span>:1734020512,</span><br><span class="line"> <span class="string">"model"</span>:<span class="string">"llama-3.2-1B"</span>,</span><br><span class="line"> <span class="string">"choices"</span>:[</span><br><span class="line"> {</span><br><span class="line"> <span class="string">"index"</span>:0,</span><br><span class="line"> <span class="string">"message"</span>:{</span><br><span class="line"> <span class="string">"content"</span>:<span class="string">"The capital of Japan is Tokyo."</span>,</span><br><span class="line"> <span class="string">"role"</span>:<span class="string">"assistant"</span></span><br><span class="line"> },</span><br><span class="line"> <span 
class="string">"finish_reason"</span>:<span class="string">"stop"</span>,</span><br><span class="line"> <span class="string">"logprobs"</span>:null</span><br><span class="line"> }</span><br><span class="line"> ],</span><br><span class="line"> <span class="string">"usage"</span>:{</span><br><span class="line"> <span class="string">"prompt_tokens"</span>:32,</span><br><span class="line"> <span class="string">"completion_tokens"</span>:9,</span><br><span class="line"> <span class="string">"total_tokens"</span>:41</span><br><span class="line"> }</span><br><span class="line">}</span><br></pre></td></tr></table></figure><p>If LlamaEdge interests you, head over to the project's GitHub page for more information.</p><p>Thanks for reading :-)</p><h2 id="寫在最後"><a href="#寫在最後" class="headerlink" title="寫在最後"></a>Closing thoughts</h2><p>Thanks to the friends who asked what happened with DigitalOcean: I wanted to test whether I could also spin up a GPU Droplet for higher-performance LLM experiments, but DigitalOcean turned me down.<br>I requested a GPU Droplet quota through the official page, and their reply is below; once the work above was finished, I removed every Droplet and custom image I had on DO.<br>I have no plans to deploy anything else there, because DigitalOcean clearly has no interest in serving small customers like me. Azure, GCP, and AWS are all comparatively friendly; even a nano-scale customer can get the resources they need.<br>I personally strongly recommend using another cloud provider rather than DigitalOcean.</p><p>The original reply is below; to spare the staff member any trouble, I have replaced their name with XXX:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span 
class="line">25</span><br></pre></td><td class="code"><pre><span class="line">Hello,</span><br><span class="line"></span><br><span class="line">Greetings from DigitalOcean!</span><br><span class="line"></span><br><span class="line">Thanks for reaching out, my name is XXX from DigitalOcean Support.</span><br><span class="line"></span><br><span class="line">After reviewing your account and your request, we are unable to approve access</span><br><span class="line"> to GPU Droplets at this time for your team.</span><br><span class="line"></span><br><span class="line">We would be happy to consider your request for a higher resource limit after</span><br><span class="line"> you have additional billing history on our platform.</span><br><span class="line">To provide additional context to what that means, invoices generate on the</span><br><span class="line"> first day of every month and are based on the services consumed in the prior</span><br><span class="line"> billing cycle. As invoices are generated and successfully paid using the</span><br><span class="line"> payment method on file for your team, this generates billing history which</span><br><span class="line"> is what we use to determine eligibility for resource limit increases.</span><br><span class="line"></span><br><span class="line">Should you have any questions or require further assistance regarding this</span><br><span class="line"> request, please do not hesitate to reach out to us.</span><br><span class="line"></span><br><span class="line">Warm Regards,</span><br><span class="line"></span><br><span class="line">XXX</span><br><span class="line">Associate Customer Advocate</span><br><span class="line">DigitalOcean Support</span><br></pre></td></tr></table></figure>]]></content>
<summary type="html"><h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>I previously submitted <a href="https://github.com/flatcar/sysext-bakery/pulls?q=is:pr+author:hydai+is:closed">two PRs</a> to Flatcar's system-extension bakery so that <a href="https://github.com/WasmEdge/WasmEdge/">WasmEdge</a> and <a href="https://github.com/LlamaEdge/LlamaEdge">LlamaEdge</a> get installed as sysext images when Flatcar is deployed. After some trouble with DigitalOcean, I decided to write these notes down before migrating to Azure or another friendlier platform.</p>
<p>This post covers:</p>
<ol>
<li>How to build your own Flatcar sysext (system extension).</li>
<li>How to add a custom Flatcar Container Linux image on DigitalOcean.</li>
<li>How to deploy Flatcar Container Linux on DigitalOcean.</li>
<li>How to run a Wasm application (disclosure: I am a maintainer of WasmEdge).</li>
<li>How to deploy a large language model (LLM) on Flatcar Container Linux, using LlamaEdge as the example (disclosure: this project compiles to Wasm and runs on WasmEdge).</li>
</ol></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="Flatcar" scheme="https://hyd.ai/tags/Flatcar/"/>
<category term="Linux" scheme="https://hyd.ai/tags/Linux/"/>
<category term="DigitalOcean" scheme="https://hyd.ai/tags/DigitalOcean/"/>
<category term="LLM" scheme="https://hyd.ai/tags/LLM/"/>
<category term="WasmEdge" scheme="https://hyd.ai/tags/WasmEdge/"/>
<category term="WebAssembly" scheme="https://hyd.ai/tags/WebAssembly/"/>
<category term="Wasm" scheme="https://hyd.ai/tags/Wasm/"/>
<category term="WASI-NN" scheme="https://hyd.ai/tags/WASI-NN/"/>
</entry>
<entry>
<title>Gaming gear log @2024</title>
<link href="https://hyd.ai/2024/12/08/game-console-2024/"/>
<id>https://hyd.ai/2024/12/08/game-console-2024/</id>
<published>2024-12-08T10:38:19.000Z</published>
<updated>2024-12-08T11:35:02.041Z</updated>
<content type="html"><![CDATA[<h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>This post records the gaming gear currently in active service, as a year-end summary of my 2024.<br>I plan to update it at the end of every year for future reference.</p><span id="more"></span><h2 id="全部設備"><a href="#全部設備" class="headerlink" title="全部設備"></a>All devices</h2><ul><li>Desktop<ul><li>CPU: AMD Ryzen 9 5900X (queued up for the launch on 2020/11/5, AMD Yes!)</li><li>Motherboard: ASUS ROG Crosshair VIII Hero (WI-FI) (bought together with the 5900X)</li><li>GPU: ASUS ROG STRIX RTX3080TI O12G GAMING (bought in 2022 for gaming)</li><li>RAM: G.SKILL 焰光戟 32GBx2 DDR4-4000 (upgraded in 2022)</li><li>SSD1: Samsung 970 Pro 1TB (bought in 2020)</li><li>SSD2: Samsung 990 Pro 4TB (bought at an all-time-low price on Black Friday 2024)</li><li>PSU: ASUS ROG 750W Gold, fully modular, 10-year warranty (bundled with the 3080TI)</li><li>Case: ASUS TUF Gaming GT301 ATX (bundled with the 5900X)</li><li>Display: Sony KD-55X9500G (bought in 2019)</li><li>Keyboard: Majestouch Convertible 3 Tenkeyless, black switches</li><li>Mouse: Logitech G502</li><li>Headphones: Sony INZONE H9 / ATH-A900X</li><li>Microphone: Blue Yeti X</li><li>Chair: Herman Miller Aeron, fully loaded (bought in 2015)</li></ul></li><li>Home consoles<ul><li>Sony PS4 Pro (left at my parents' place as a Remote Play box)</li><li>Sony PS5</li><li>Nintendo Switch (first generation)</li><li>Nintendo Switch Lite</li><li>Nintendo Switch Animal Crossing edition</li></ul></li><li>Handhelds<ul><li>Sony PS Vita</li><li>New Nintendo 3DS XL * 2</li><li>Apple iPhone 11 Pro</li><li>Apple iPhone 13 Pro</li><li>Apple iPhone 15 Pro (current daily driver)</li><li>Apple iPad Pro M1 11” (officially the 3rd generation, though I have long lost track of what this naming means XD)</li><li>Sony PlayStation Portal</li><li>Steam Deck OLED</li></ul></li></ul><h2 id="設備用途"><a href="#設備用途" class="headerlink" title="設備用途"></a>What each device is for</h2><p>Last time I talked with friends about my gaming setup, the question I got most was: with that many devices, what am I actually playing, and where does the time come from?<br>Honestly, often I don't know either. Much of the time I buy something because I "want to play it"; whether I "actually play it" is another matter.<br>It's like my Steam library: far too many purchased games, and probably fewer than one in ten ever fully finished.<br>Sometimes it feels tiring: I already spent money on the games, and then I still have to spend time playing them. Perhaps this is what people call gaming impotence.</p><p>Below I sort out what each of these in-service devices is used for, which may show where their value lies.</p><h3 id="突然想回味當年"><a href="#突然想回味當年" class="headerlink" title="突然想回味當年"></a>Sudden nostalgia</h3><p>Some games were never ported to newer platforms, so old hardware is the only way to play them.</p><ul><li>Sony PS Vita</li><li>New Nintendo 3DS XL * 2</li></ul><p>These two are almost never powered on: the PS Vita gets played maybe once or twice a year, and the 3DS whenever I think of my old Pokémon comrades and want to reminisce.</p><h3 id="放置型遊戲"><a href="#放置型遊戲" class="headerlink" title="放置型遊戲"></a>Idle games</h3><p>These are phone games you simply leave running; just watching them is soothing.<br>Retired phones have more than enough performance for this, and they are pleasant to glance at on the desk.</p><ul><li>Apple iPhone 11 Pro</li><li>Apple iPhone 13 Pro</li></ul><h3 id="出門在外-回老家"><a href="#出門在外-回老家" class="headerlink" title="出門在外 / 回老家"></a>On the road / back home</h3><p>Portability first; these come along to kill time on long commutes.</p><ul><li>Apple iPad Pro M1 11” (productivity before purchase, gaming after)</li><li>Steam Deck OLED (all kinds of indie games)</li><li>Nintendo Switch Animal Crossing edition (the first-gen and Lite units are now each dedicated to specific games)</li></ul><h3 id="認真在家打遊戲"><a href="#認真在家打遊戲" class="headerlink" title="認真在家打遊戲"></a>Serious gaming at home</h3><h4 id="PS-家族"><a href="#PS-家族" class="headerlink" title="PS 家族"></a>The PS family</h4><p>I had long admired home consoles but never felt like buying one, since no exclusive had hooked me at first.<br>Then Monster Hunter: World / Iceborne arrived, and not buying was no longer an option.<br>So I got a PS4 Pro and fitted it with an SSD, because those loading times... time is precious, don't squander it.</p><p>Later, in the pandemic era, the PS5 launched. I didn't originally want one; seeing the scarcity marketing, entering the lottery cost nothing, and worst case I would win and hand it to a friend.<br>I won, my friend also won, and so the PS5 moved in.<br>Sadly the PS5 has too few exclusives, and the touted 8K exists only in dreams, so I basically used it as a Blu-ray player.<br>Now that even the PS5 Pro is out, what is this PS5 for these days? PS VR2 + DMM VR :-) If you know, you know.</p><p>This year's pleasant surprise was the PlayStation Portal, my dedicated bathtub machine: I can soak while playing PS5 games, complete with the PS controller's native rumble and adaptive triggers.<br>I played Stellar Blade, FF7 Remake, and Astro Bot on it, and it is wonderfully comfortable.</p><h4 id="桌機"><a href="#桌機" class="headerlink" title="桌機"></a>Desktop</h4><p>In practice the desktop is where I game most seriously. Jensen's 3080Ti is unbeatable; Flight Simulator, COD, FF14 and other big titles all run smoothly.<br>Besides, fewer and fewer games are exclusives nowadays, so I am shifting my focus from consoles to the desktop.</p><h2 id="結語"><a href="#結語" class="headerlink" title="結語"></a>Closing</h2><p>This is a record for myself, and a way to see just how much gaming gear I actually own. Before adding anything new in 2025, be careful: don't let impulse turn into regret.</p><p>To my 2025 self: what new gear have you bought by then?</p>]]></content>
<summary type="html"><h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>This post records the gaming gear currently in active service, as a year-end summary of my 2024.<br>I plan to update it at the end of every year for future reference.</p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="2024" scheme="https://hyd.ai/tags/2024/"/>
<category term="遊戲用設備" scheme="https://hyd.ai/tags/%E9%81%8A%E6%88%B2%E7%94%A8%E8%A8%AD%E5%82%99/"/>
</entry>
<entry>
<title>Speeding up macOS Time Machine backups</title>
<link href="https://hyd.ai/2024/11/24/time-machine-speed-up/"/>
<id>https://hyd.ai/2024/11/24/time-machine-speed-up/</id>
<published>2024-11-24T08:58:22.000Z</published>
<updated>2024-11-24T09:20:13.427Z</updated>
<content type="html"><![CDATA[<h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>Time Machine on macOS is an excellent backup tool that has rescued me at critical moments more than once. Halfway through my master's thesis my MacBook Pro simply died, and thanks to the Time Machine backup I was still able to graduate on time.</p><p>As the backed-up data grows, however, backups sometimes become abnormally slow, occasionally taking a dozen-plus hours to finish, which felt especially risky while I was abroad.</p><p>While searching I found that plenty of people worldwide hit the same problem, and there is practically a standard fix for it <code>(/ω\)</code></p><span id="more"></span><h2 id="解決方法"><a href="#解決方法" class="headerlink" title="解決方法"></a>The fix</h2><p>Unlike other backup mechanisms, Time Machine has a hidden parameter that caps backup speed at a fairly low value and schedules backups at low priority, so if you never change it, backups keep getting slower as the accumulated data grows.</p><p>Unfortunately the parameter is hidden, so we have to modify it from the terminal.</p><p>Open a terminal and enter the following command:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">sudo</span> sysctl debug.lowpri_throttle_enabled=0</span><br></pre></td></tr></table></figure><p>With that, Time Machine backups return to their normal speed.</p><h3 id="指令解釋"><a href="#指令解釋" class="headerlink" title="指令解釋"></a>Command breakdown</h3><p><code>sudo</code> is required because we modify a system parameter via <code>sysctl</code>, which needs root privileges.</p><p><code>sysctl</code> is the tool on Unix-like operating systems for reading and modifying kernel parameters.</p><p><code>debug.lowpri_throttle_enabled</code> is the parameter in question: when it is 1, Time Machine's backup speed is throttled to a fairly low value; when it is 0, that limit no longer applies.</p><p><code>lowpri</code> is short for low priority, <code>throttle</code> means limiting throughput, and <code>enabled</code> means turned on, so the parameter enables the low-priority throttle.</p><h2 id="恢復原來的設定"><a href="#恢復原來的設定" class="headerlink" title="恢復原來的設定"></a>Restoring the original setting</h2><p>The change above is reset after a reboot, so even if you forget to restore it, time will take care of everything.</p><p>If you still want to restore the original setting right after the backup completes, enter:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">sudo</span> sysctl debug.lowpri_throttle_enabled=1</span><br></pre></td></tr></table></figure><h2 id="方便的別名"><a href="#方便的別名" class="headerlink" title="方便的別名"></a>Handy aliases</h2><p>Typing such a long command every time is a hassle, so you can use <code>alias</code> to define shortcuts and do the same thing with a short command.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># Add the following lines to your .bashrc or .zshrc</span></span><br><span class="line"><span class="built_in">alias</span> tmspup=<span class="string">'sudo sysctl debug.lowpri_throttle_enabled=0'</span></span><br><span class="line"><span class="built_in">alias</span> tmspdown=<span class="string">'sudo sysctl debug.lowpri_throttle_enabled=1'</span></span><br><span class="line"><span class="comment"># From now on, use tmspup and tmspdown to quickly toggle the Time Machine backup speed</span></span><br></pre></td></tr></table></figure><h2 id="Reference"><a href="#Reference" class="headerlink" title="Reference"></a>Reference</h2><ul><li><a href="https://www.howtogeek.com/843598/how-to-speed-up-your-time-machine-backups/">How to Speed Up Your Time Machine Backups</a></li></ul>]]></content>
<summary type="html"><h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>Time Machine on macOS is an excellent backup tool that has rescued me at critical moments more than once. Halfway through my master's thesis my MacBook Pro simply died, and thanks to the Time Machine backup I was still able to graduate on time.</p>
<p>As the backed-up data grows, however, backups sometimes become abnormally slow, occasionally taking a dozen-plus hours to finish, which felt especially risky while I was abroad.</p>
<p>While searching I found that plenty of people worldwide hit the same problem, and there is practically a standard fix for it <code>(/ω\)</code></p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="macOS" scheme="https://hyd.ai/tags/macOS/"/>
<category term="Time Machine" scheme="https://hyd.ai/tags/Time-Machine/"/>
</entry>
<entry>
<title>Building a simple URL shortener</title>
<link href="https://hyd.ai/2024/11/19/url-shortener/"/>
<id>https://hyd.ai/2024/11/19/url-shortener/</id>
<published>2024-11-19T05:06:22.000Z</published>
<updated>2024-11-19T05:51:22.934Z</updated>
<content type="html"><![CDATA[<h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>Preface</h2><p>My previous URL shortener ran on picsee, but the free plan does not support HTTPS, which matters a lot: if HTTPS appears in the URL, the redirect simply fails.</p><p>P.S. I am unhappy enough with picsee that I won't link to it, and I strongly advise against using it.</p><p>For a while a Cloudflare proxy could make it look like HTTPS, but some update broke that too, and I discovered the breakage right before a talk, so I got fleeced into paying NT$1,000 for a year of their premium URL service.</p><p>Had I known it would be this much trouble, I should have migrated to <a href="https://lihi.io/">lihi</a> earlier, while nothing was urgent; if you have to pay anyway, you might as well pick a better provider.</p><p>After a while, though, I realized I don't actually need that many features. I just want a simple URL shortener: no click tracking, no URL statistics, because the only thing I need is for my audience to reach my slides or other materials via a shorter URL.<br>That's when I happened upon <a href="https://github.com/sitcon-tw/URL-Shortener">SITCON's URL-Shortener</a> and found it was exactly what I wanted.</p><p>Feature-wise:</p><ul><li>Custom URLs</li><li>A short description per URL</li><li>Images</li><li>Everything in markdown syntax</li><li>Automatic deployment via GitHub workflow</li></ul><p>So, based on their project, I built a <a href="https://github.com/hydai/URL-Shortener">similar URL shortener</a> of my own</p><span id="more"></span><p>Below is the whole setup process, for anyone who wants to host their own service.</p><h2 id="建立縮網址服務"><a href="#建立縮網址服務" class="headerlink" title="建立縮網址服務"></a>Building the URL shortener</h2><p>In this project I use Jekyll for the redirect mechanism and GitHub Pages for deployment.</p><h3 id="安裝-Jekyll"><a href="#安裝-Jekyll" class="headerlink" title="安裝 Jekyll"></a>Installing Jekyll</h3><p>My environment is macOS, and to avoid clashing with the Ruby bundled with macOS, I use chruby and ruby-install to manage and install ruby.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br></pre></td><td class="code"><pre><span class="line">brew install chruby ruby-install</span><br><span class="line">ruby-install ruby 3.3.6 <span class="comment"># installed the latest version at the time</span></span><br></pre></td></tr></table></figure><p>Next, add the path to your <code>~/.bashrc</code> or <code>~/.zshrc</code>.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">export</span> PATH=<span class="string">"<span class="variable">$HOME</span>/.rubies/ruby-3.3.6/bin:<span class="variable">$PATH</span>"</span> <span 
class="comment"># replace 3.3.6 with the version you installed</span></span><br></pre></td></tr></table></figure><p>Then install Jekyll</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">gem install jekyll bundler</span><br></pre></td></tr></table></figure><h3 id="建立專案"><a href="#建立專案" class="headerlink" title="建立專案"></a>Creating the project</h3><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">jekyll new url-shortener</span><br></pre></td></tr></table></figure><p>At this point your directory should contain the following files:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br></pre></td><td class="code"><pre><span class="line">├── 404.html</span><br><span class="line">├── Gemfile</span><br><span class="line">├── Gemfile.lock</span><br><span class="line">├── _config.yml</span><br><span class="line">├── _posts</span><br><span class="line">│ └── 2024-11-19-welcome-to-jekyll.markdown</span><br><span class="line">├── about.markdown</span><br><span class="line">└── index.markdown</span><br></pre></td></tr></table></figure><p>However, as a URL shortener all we really need is the redirect mechanism, so let's first delete the files we don't need.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">rm</span> -rf _posts about.markdown</span><br></pre></td></tr></table></figure><p>After deleting, this is what remains:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br></pre></td><td class="code"><pre><span 
class="line">├── 404.html</span><br><span class="line">├── Gemfile</span><br><span class="line">├── Gemfile.lock</span><br><span class="line">├── _config.yml</span><br><span class="line">└── index.markdown</span><br></pre></td></tr></table></figure><h3 id="建立重導向的版型與配置"><a href="#建立重導向的版型與配置" class="headerlink" title="建立重導向的版型與配置"></a>Creating the redirect layout and configs</h3><p>Now we need to create two folders: <code>_layouts</code> to hold the layout, and <code>_redirects</code> to hold the redirect configs.</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line"><span class="built_in">mkdir</span> _layouts _redirects</span><br></pre></td></tr></table></figure><h4 id="重導向的版型"><a href="#重導向的版型" class="headerlink" title="重導向的版型"></a>The redirect layout</h4><p>Create a file named <code>redirects.html</code> inside <code>_layouts</code>; this is the redirect layout. You can change it however you like; mine is based on SITCON's template and redirects to the target URL after 0.5 seconds.</p><figure class="highlight html"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br></pre></td><td class="code"><pre><span class="line"><span class="meta"><!DOCTYPE <span class="keyword">html</span>></span></span><br><span class="line"><span class="tag"><<span class="name">html</span> <span class="attr">lang</span>=<span class="string">"zh-Hant-TW"</span>></span></span><br><span class="line"><span 
class="tag"><<span class="name">head</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">charset</span>=<span class="string">"utf-8"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">title</span>></span>{{ page.title }}<span class="tag"></<span class="name">title</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">name</span>=<span class="string">"description"</span> <span class="attr">content</span>=<span class="string">"{{ page.description }}"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">name</span>=<span class="string">"author"</span> <span class="attr">content</span>=<span class="string">"hydai"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">property</span>=<span class="string">"og:title"</span> <span class="attr">content</span>=<span class="string">"{{ page.title }}"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">property</span>=<span class="string">"og:description"</span> <span class="attr">content</span>=<span class="string">"{{ page.description }}"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">property</span>=<span class="string">"og:image"</span> <span class="attr">content</span>=<span class="string">"{{ page.image }}"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">property</span>=<span class="string">"og:site_name"</span> <span class="attr">content</span>=<span class="string">"hydai"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">meta</span> <span class="attr">property</span>=<span class="string">"og:locale"</span> <span 
class="attr">content</span>=<span class="string">"zh_TW"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">script</span>></span><span class="language-javascript"></span></span><br><span class="line"><span class="language-javascript"> <span class="built_in">setTimeout</span>(<span class="keyword">function</span>(<span class="params"></span>) {</span></span><br><span class="line"><span class="language-javascript"> <span class="variable language_">window</span>.<span class="property">location</span>.<span class="property">href</span> = <span class="string">"{{ page.redirect_to }}"</span>;</span></span><br><span class="line"><span class="language-javascript"> }, <span class="number">500</span>);</span></span><br><span class="line"><span class="language-javascript"> </span><span class="tag"></<span class="name">script</span>></span></span><br><span class="line"><span class="tag"></<span class="name">head</span>></span></span><br><span class="line"><span class="tag"><<span class="name">body</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">p</span>></span>重新導向到 (redirect to) <span class="tag"><<span class="name">a</span> <span class="attr">href</span>=<span class="string">"{{ page.redirect_to }}"</span>></span>{{ page.redirect_to }}<span class="tag"></<span class="name">a</span>></span>⋯⋯<span class="tag"></<span class="name">p</span>></span></span><br><span class="line"><span class="tag"></<span class="name">body</span>></span></span><br><span class="line"><span class="tag"></<span class="name">html</span>></span></span><br></pre></td></tr></table></figure><h4 id="重導向的配置"><a href="#重導向的配置" class="headerlink" title="重導向的配置"></a>重導向的配置</h4><p>在 <code>_redirects</code> 裡面建立一個 <code>template.markdown</code> 的檔案,這個檔案是用來建立重導向的配置。</p><figure class="highlight markdown"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span 
class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br></pre></td><td class="code"><pre><span class="line">---</span><br><span class="line">layout: redirects</span><br><span class="line">title: "自定義標題"</span><br><span class="line">description: "自定義描述"</span><br><span class="line"><span class="section">redirect<span class="emphasis">_to: "https://hyd.ai/"</span></span></span><br><span class="line"><span class="emphasis"><span class="section">---</span></span></span><br></pre></td></tr></table></figure><h4 id="最終的檔案架構"><a href="#最終的檔案架構" class="headerlink" title="最終的檔案架構"></a>最終的檔案架構</h4><p>到這邊就準備好了,整個專案的檔案架構應該是這樣:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br></pre></td><td class="code"><pre><span class="line">├── 404.html</span><br><span class="line">├── Gemfile</span><br><span class="line">├── Gemfile.lock</span><br><span class="line">├── _config.yml</span><br><span class="line">├── _layouts</span><br><span class="line">│ └── redirects.html</span><br><span class="line">├── _redirects</span><br><span class="line">│ └── template.markdown</span><br><span class="line">└── index.markdown</span><br></pre></td></tr></table></figure><h3 id="修改設定來提供重導向服務"><a href="#修改設定來提供重導向服務" class="headerlink" title="修改設定來提供重導向服務"></a>修改設定來提供重導向服務</h3><h4 id="index-markdown"><a href="#index-markdown" class="headerlink" title="index.markdown"></a>index.markdown</h4><p>若有人訪問了此服務網站的首頁,我們至少可以將它導向到其他地方,比如個人的部落格。</p><p>在 <code>index.markdown</code> 裡面加入以下的內容:</p><figure class="highlight markdown"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span 
class="line">5</span><br><span class="line">6</span><br></pre></td><td class="code"><pre><span class="line">---</span><br><span class="line">layout: redirects</span><br><span class="line">title: "hydai's blog"</span><br><span class="line">description: "hydaiの空想世界"</span><br><span class="line"><span class="section">redirect<span class="emphasis">_to: "https://hyd.ai/"</span></span></span><br><span class="line"><span class="emphasis"><span class="section">---</span></span></span><br></pre></td></tr></table></figure><h4 id="404-html"><a href="#404-html" class="headerlink" title="404.html"></a>404.html</h4><p>若有人訪問了不存在的縮網址,我們可以透過 404.html 來讓他轉到首頁或者其他指定的地方,如個人的部落格。</p><p>在 <code>404.html</code> 裡面加入以下的內容:</p><figure class="highlight html"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br></pre></td><td class="code"><pre><span class="line">---</span><br><span class="line">permalink: /404.html</span><br><span class="line">layout: redirects</span><br><span class="line">title: "hydai's blog"</span><br><span class="line">description: "hydaiの空想世界"</span><br><span class="line">redirect_to: 
"https://hyd.ai/"</span><br><span class="line">---</span><br><span class="line"></span><br><span class="line"><span class="tag"><<span class="name">style</span> <span class="attr">type</span>=<span class="string">"text/css"</span> <span class="attr">media</span>=<span class="string">"screen"</span>></span><span class="language-css"></span></span><br><span class="line"><span class="language-css"> <span class="selector-class">.container</span> {</span></span><br><span class="line"><span class="language-css"> <span class="attribute">margin</span>: <span class="number">10px</span> auto;</span></span><br><span class="line"><span class="language-css"> <span class="attribute">max-width</span>: <span class="number">600px</span>;</span></span><br><span class="line"><span class="language-css"> <span class="attribute">text-align</span>: center;</span></span><br><span class="line"><span class="language-css"> }</span></span><br><span class="line"><span class="language-css"> <span class="selector-tag">h1</span> {</span></span><br><span class="line"><span class="language-css"> <span class="attribute">margin</span>: <span class="number">30px</span> <span class="number">0</span>;</span></span><br><span class="line"><span class="language-css"> <span class="attribute">font-size</span>: <span class="number">4em</span>;</span></span><br><span class="line"><span class="language-css"> <span class="attribute">line-height</span>: <span class="number">1</span>;</span></span><br><span class="line"><span class="language-css"> <span class="attribute">letter-spacing</span>: -<span class="number">1px</span>;</span></span><br><span class="line"><span class="language-css"> }</span></span><br><span class="line"><span class="language-css"></span><span class="tag"></<span class="name">style</span>></span></span><br><span class="line"></span><br><span class="line"><span class="tag"><<span class="name">div</span> <span class="attr">class</span>=<span 
class="string">"container"</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">h1</span>></span>404<span class="tag"></<span class="name">h1</span>></span></span><br><span class="line"></span><br><span class="line"> <span class="tag"><<span class="name">p</span>></span><span class="tag"><<span class="name">strong</span>></span>Page not found :(<span class="tag"></<span class="name">strong</span>></span><span class="tag"></<span class="name">p</span>></span></span><br><span class="line"> <span class="tag"><<span class="name">p</span>></span>The requested page could not be found.<span class="tag"></<span class="name">p</span>></span></span><br><span class="line"><span class="tag"></<span class="name">div</span>></span></span><br></pre></td></tr></table></figure><h4 id="清理不需要的相依性"><a href="#清理不需要的相依性" class="headerlink" title="清理不需要的相依性"></a>清理不需要的相依性</h4><p>由於這個縮網址服務不需要任何的主題與擴充元件,因此我們可以將 Gemfile 裡面用不到的相依性清除,如 <code>minima</code>, <code>jekyll-feed</code> 等。</p><p>以下是我最後的 Gemfile:</p><figure class="highlight plaintext"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br></pre></td><td class="code"><pre><span 
class="line">source "https://rubygems.org"</span><br><span class="line"># Hello! This is where you manage which Jekyll version is used to run.</span><br><span class="line"># When you want to use a different version, change it below, save the</span><br><span class="line"># file and run `bundle install`. Run Jekyll with `bundle exec`, like so:</span><br><span class="line">#</span><br><span class="line"># bundle exec jekyll serve</span><br><span class="line">#</span><br><span class="line"># This will help ensure the proper Jekyll version is running.</span><br><span class="line"># Happy Jekylling!</span><br><span class="line">gem "jekyll", "~> 4.3.4"</span><br><span class="line"># If you want to use GitHub Pages, remove the "gem "jekyll"" above and</span><br><span class="line"># uncomment the line below. To upgrade, run `bundle update github-pages`.</span><br><span class="line"># gem "github-pages", group: :jekyll_plugins</span><br><span class="line">#</span><br><span class="line"># Windows and JRuby does not include zoneinfo files, so bundle the tzinfo-data gem</span><br><span class="line"># and associated library.</span><br><span class="line">platforms :mingw, :x64_mingw, :mswin, :jruby do</span><br><span class="line"> gem "tzinfo", ">= 1", "< 3"</span><br><span class="line"> gem "tzinfo-data"</span><br><span class="line">end</span><br><span class="line"></span><br><span class="line"># Performance-booster for watching directories on Windows</span><br><span class="line">gem "wdm", "~> 0.1", :platforms => [:mingw, :x64_mingw, :mswin]</span><br><span class="line"></span><br><span class="line"># Lock `http_parser.rb` gem to `v0.6.x` on JRuby builds since newer versions of the gem</span><br><span class="line"># do not have a Java counterpart.</span><br><span class="line">gem "http_parser.rb", "~> 0.6.0", :platforms => [:jruby]</span><br></pre></td></tr></table></figure><h4 id="修改-config-yml"><a href="#修改-config-yml" class="headerlink" title="修改 _config.yml"></a>修改 
<code>_config.yml</code></h4><p>把基本的資訊都寫入 <code>_config.yml</code> 中,並且加上重導向相關的機制:</p><figure class="highlight yaml"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br></pre></td><td class="code"><pre><span class="line"><span class="attr">title:</span> <span class="string">URL</span> <span class="string">shortener</span></span><br><span class="line"><span class="attr">email:</span> <span class="string">hydai@hyd.ai</span></span><br><span class="line"><span class="attr">description:</span> <span class="string">>-</span> <span class="comment"># this means to ignore newlines until "baseurl:"</span></span><br><span class="line"> <span class="string">URL</span> <span class="string">shortener</span> <span class="string">for</span> <span class="string">hyd.ai</span></span><br><span class="line"><span class="attr">baseurl:</span> <span class="string">""</span> <span class="comment"># the subpath of your site, e.g. /blog</span></span><br><span class="line"><span class="attr">url:</span> <span class="string">"https://hyd.ai"</span> <span class="comment"># the base hostname & protocol for your site, e.g. 
http://example.com</span></span><br><span class="line"><span class="attr">twitter_username:</span> <span class="string">hydai_tw</span></span><br><span class="line"><span class="attr">github_username:</span> <span class="string">hydai</span></span><br><span class="line"></span><br><span class="line"><span class="attr">collections:</span></span><br><span class="line"> <span class="attr">redirects:</span></span><br><span class="line"> <span class="attr">output:</span> <span class="literal">true</span></span><br><span class="line"> <span class="attr">permalink:</span> <span class="string">/:path/</span></span><br></pre></td></tr></table></figure><p>到此為止就是整個縮網址服務的建立流程,接下來就是部署的部分。</p><h3 id="部署到-GitHub-Pages"><a href="#部署到-GitHub-Pages" class="headerlink" title="部署到 GitHub Pages"></a>部署到 GitHub Pages</h3><p>在這個專案中,新增一個 workflows 的檔案 <code>jekyll.yml</code>,你可以取不同的名字,他將用來處理我們部署到 GitHub Pages 所用。</p><figure class="highlight yaml"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br><span 
class="line">33</span><br><span class="line">34</span><br><span class="line">35</span><br><span class="line">36</span><br><span class="line">37</span><br><span class="line">38</span><br><span class="line">39</span><br><span class="line">40</span><br><span class="line">41</span><br><span class="line">42</span><br><span class="line">43</span><br><span class="line">44</span><br><span class="line">45</span><br><span class="line">46</span><br><span class="line">47</span><br><span class="line">48</span><br><span class="line">49</span><br><span class="line">50</span><br><span class="line">51</span><br><span class="line">52</span><br><span class="line">53</span><br><span class="line">54</span><br><span class="line">55</span><br><span class="line">56</span><br><span class="line">57</span><br><span class="line">58</span><br><span class="line">59</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># 給個名字</span></span><br><span class="line"><span class="attr">name:</span> <span class="string">Deploy</span> <span class="string">Jekyll</span> <span class="string">site</span> <span class="string">to</span> <span class="string">Pages</span></span><br><span class="line"></span><br><span class="line"><span class="attr">on:</span></span><br><span class="line"> <span class="comment"># Runs on pushes targeting the default branch</span></span><br><span class="line"> <span class="comment"># 只有在 commits 被推到 main 分支時才會觸發</span></span><br><span class="line"> <span class="attr">push:</span></span><br><span class="line"> <span class="attr">branches:</span> [<span class="string">"main"</span>]</span><br><span class="line"></span><br><span class="line"> <span class="comment"># 允許你手動觸發這個 workflow</span></span><br><span class="line"> <span class="attr">workflow_dispatch:</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># 設定 GITHUB_TOKEN 的權限,讓他可以部署到 GitHub Pages</span></span><br><span class="line"><span 
class="attr">permissions:</span></span><br><span class="line"> <span class="attr">contents:</span> <span class="string">read</span></span><br><span class="line"> <span class="attr">pages:</span> <span class="string">write</span></span><br><span class="line"> <span class="attr">id-token:</span> <span class="string">write</span></span><br><span class="line"></span><br><span class="line"><span class="comment"># 如果有多個部署在排隊,我們只會保留最新的來執行,但不會取消正在進行的部署</span></span><br><span class="line"><span class="attr">concurrency:</span></span><br><span class="line"> <span class="attr">group:</span> <span class="string">"pages"</span></span><br><span class="line"> <span class="attr">cancel-in-progress:</span> <span class="literal">false</span></span><br><span class="line"></span><br><span class="line"><span class="attr">jobs:</span></span><br><span class="line"> <span class="attr">build:</span></span><br><span class="line"> <span class="attr">runs-on:</span> <span class="string">ubuntu-latest</span></span><br><span class="line"> <span class="attr">steps:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Checkout</span></span><br><span class="line"> <span class="comment"># 將程式碼 checkout 到 runner 上</span></span><br><span class="line"> <span class="attr">uses:</span> <span class="string">actions/checkout@v4</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Setup</span> <span class="string">Ruby</span></span><br><span class="line"> <span class="comment"># 設定 Ruby 環境</span></span><br><span class="line"> <span class="attr">uses:</span> <span class="string">ruby/setup-ruby@v1</span></span><br><span class="line"> <span class="attr">with:</span></span><br><span class="line"> <span class="attr">ruby-version:</span> <span class="string">'3.3'</span></span><br><span class="line"> <span class="attr">bundler-cache:</span> <span 
class="literal">true</span></span><br><span class="line"> <span class="attr">cache-version:</span> <span class="number">0</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Setup</span> <span class="string">Pages</span></span><br><span class="line"> <span class="attr">id:</span> <span class="string">pages</span></span><br><span class="line"> <span class="attr">uses:</span> <span class="string">actions/configure-pages@v5</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Build</span> <span class="string">with</span> <span class="string">Jekyll</span></span><br><span class="line"> <span class="comment"># 預設會輸出到 './_site' 目錄</span></span><br><span class="line"> <span class="attr">run:</span> <span class="string">bundle</span> <span class="string">exec</span> <span class="string">jekyll</span> <span class="string">build</span> <span class="string">--baseurl</span> <span class="string">"$<span class="template-variable">{{ steps.pages.outputs.base_path }}</span>"</span></span><br><span class="line"> <span class="attr">env:</span></span><br><span class="line"> <span class="attr">JEKYLL_ENV:</span> <span class="string">production</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Upload</span> <span class="string">artifact</span></span><br><span class="line"> <span class="comment"># 預設會上傳 './_site' 目錄</span></span><br><span class="line"> <span class="attr">uses:</span> <span class="string">actions/upload-pages-artifact@v3</span></span><br><span class="line"></span><br><span class="line"> <span class="attr">deploy:</span></span><br><span class="line"> <span class="attr">environment:</span></span><br><span class="line"> <span class="attr">name:</span> <span class="string">github-pages</span></span><br><span class="line"> <span class="attr">url:</span> <span 
class="string">${{</span> <span class="string">steps.deployment.outputs.page_url</span> <span class="string">}}</span></span><br><span class="line"> <span class="attr">runs-on:</span> <span class="string">ubuntu-latest</span></span><br><span class="line"> <span class="attr">needs:</span> <span class="string">build</span></span><br><span class="line"> <span class="attr">steps:</span></span><br><span class="line"> <span class="bullet">-</span> <span class="attr">name:</span> <span class="string">Deploy</span> <span class="string">to</span> <span class="string">GitHub</span> <span class="string">Pages</span></span><br><span class="line"> <span class="attr">id:</span> <span class="string">deployment</span></span><br><span class="line"> <span class="attr">uses:</span> <span class="string">actions/deploy-pages@v4</span></span><br></pre></td></tr></table></figure><h3 id="在-GitHub-上啟用-Pages"><a href="#在-GitHub-上啟用-Pages" class="headerlink" title="在 GitHub 上啟用 Pages"></a>在 GitHub 上啟用 Pages</h3><p>要注意的是,當將上面的 workflow 啟用後,如果以前在該 repo 上並沒有啟用過 GitHub Pages 的話,你需要到 <code>Settings</code> -> <code>Pages</code> 中啟用 GitHub Pages。</p><p>因為我們是使用 GitHub Pages 的服務,因此在 <code>Build and deployment</code> -> <code>Source</code> 中選擇 <code>GitHub Actions</code> 即可。</p><h3 id="使用自己的網域"><a href="#使用自己的網域" class="headerlink" title="使用自己的網域"></a>使用自己的網域</h3><p>如果你有自己的網域,同樣在 <code>Settings</code> -> <code>Pages</code> 中的 <code>Custom domain</code> 欄位把自己的子網域放上去。</p><h2 id="結語"><a href="#結語" class="headerlink" title="結語"></a>結語</h2><p>這個縮網址服務的建立流程其實非常簡單,只要有一點點的 HTML 與 markdown 的基礎,就可以輕鬆的建立一個屬於自己的縮網址服務。過程間也不需要去理解 Jekyll 的運作原理,只要知道怎麼建立版型與配置就可以了。</p>]]></content>
<summary type="html"><h2 id="前言"><a href="#前言" class="headerlink" title="前言"></a>前言</h2><p>之前的縮網址服務我是使用 picsee 來做的,但在免費的方案中,並不支援 HTTPS,這個影響很大,如果在 URL 中存在 HTTPS 的話,轉址服務就會失效。</p>
<p>P.S. 因為我很不滿意 picsee 就不放上連結了,此外,我也非常不建議大家使用。</p>
<p>雖然有段時間可以使用 Cloudflare 的 Proxy 來讓他有 HTTPS 的樣子,可是某次更新後也失效了,而且我發現失效的時候是在演講前,因此當了盤子付了一千台幣買了一年的進階網址服務。</p>
<p>早知道這麼麻煩,當初應該在不緊急的時候先搬遷到 <a href="https://lihi.io/">lihi</a> 上面,畢竟都要付錢了,當然要找一家好一點的服務商。</p>
<p>然而,過了一段時間,我發現我其實不需要這麼多功能,只是想要一個簡單的縮網址服務而已,我並不需要追蹤成效,也不需要網址的統計,因為我唯一需要的功能就只是讓我的聽眾可以用短一點的網址拿到投影片或者是其他的資料。<br>在這時意外看到了 <a href="https://github.com/sitcon-tw/URL-Shortener">SITCON 的 URL-Shortener</a>,我發現這個服務就是我想要的。</p>
<p>在功能上:</p>
<ul>
<li>可以自訂網址</li>
<li>可以加上該網址的簡介</li>
<li>可以放圖片</li>
<li>全部都是 markdown 語法</li>
<li>可以用 GitHub workflow 來自動部署</li>
</ul>
<p>因此我就以他的專案為基礎,自己做了一個<a href="https://github.com/hydai/URL-Shortener">相似的縮網址服務</a>。</p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="URL Shortener" scheme="https://hyd.ai/tags/URL-Shortener/"/>
</entry>
<entry>
<title>每次升級 macOS 總會遇到的 xcrun error invalid active developer path, missing xcrun at ...</title>
<link href="https://hyd.ai/2016/10/01/xcurn-error-after-upgrade-10-12/"/>
<id>https://hyd.ai/2016/10/01/xcurn-error-after-upgrade-10-12/</id>
<published>2016-10-01T12:21:30.000Z</published>
<updated>2024-11-19T05:51:10.421Z</updated>
<content type="html"><![CDATA[<p>每次升級總會遇上一次的 xcrun error,<br>為了不要每次都查資料,<br>來把它記錄一下吧!</p><span id="more"></span><p>上次遇到好像是從 10.10 升級到 10.11 的時候,<br>同樣的這次從 10.11 升級到 10.12 也是遇到一樣的問題。</p><p>每次升級完以後,Xcode 跟要抓取的 lib 好像就會跑掉,<br>當我在跑 <code>brew update</code> 的時候就會噴出下面這種 error:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line">$ brew update</span><br><span class="line">xcrun: error: invalid active developer path (/Library/Developer/CommandLineTools),</span><br><span class="line">missing xcrun at: /Library/Developer/CommandLineTools/usr/bin/xcrun</span><br></pre></td></tr></table></figure><p>要解決的方法很簡單,看到上面的關鍵字 <code>CommandLineTools</code>,<br>我們就把 Xcode 的 CommandLineTools 裝上來就能解掉這個問題囉~</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br></pre></td><td class="code"><pre><span class="line">xcode-select --install</span><br></pre></td></tr></table></figure><p>執行這行以後,他會問你要不要 <code>取得 Xcode</code>,<br>如果你只是需要解掉這個問題,並沒有要使用 Xcode 做開發的話,<br>直接選「安裝」即可,不安裝 Xcode 的話,可以省下很多空間跟時間呢!</p>]]></content>
<summary type="html"><p>每次升級總會遇上一次的 xcrun error,<br>為了不要每次都查資料,<br>來把它記錄一下吧!</p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="MacOS" scheme="https://hyd.ai/tags/MacOS/"/>
<category term="xcurn" scheme="https://hyd.ai/tags/xcurn/"/>
</entry>
<entry>
<title>發 Pull Request 該注意的事情</title>
<link href="https://hyd.ai/2016/08/03/pullrequestnote/"/>
<id>https://hyd.ai/2016/08/03/pullrequestnote/</id>
<published>2016-08-03T14:06:22.000Z</published>
<updated>2024-11-19T03:00:10.630Z</updated>
<content type="html"><![CDATA[<h2 id="0-前言"><a href="#0-前言" class="headerlink" title="0. 前言"></a>0. 前言</h2><p>在漫長的程式旅途中,我們很容易就使用到 Free software(如 gcc) 以及<br>Open Source Project(如 Bootstrap) 。<br>身為用戶,我其實一直希望能為這些很棒的作品增加更多更棒的功能。</p><p>也許是身為工程師的浪漫吧,我很渴望能做出一款全世界都在用的軟體,<br>然後很自豪地說:「看呀,你在用的OOXX是我做的喔!」(燦笑)</p><p>而在這之前,我們可能更常碰到在使用 Open Source Project 的情況下遇到 Bug ,<br>或是想為其添加新功能。這時候,如果是在 GitHub 上頭,我們便會以 Pull Request<br>的方式把自己的修改發過去給 upstream 希望他們能把這樣的增進給 Merge 進去。</p><p>不過先別急著在改完就馬上送過去,不然可能很容易會被無視或是當成小白喔!</p><span id="more"></span><h2 id="1-至少要測試過才發-Pull-Request-過去喔"><a href="#1-至少要測試過才發-Pull-Request-過去喔" class="headerlink" title="1. 至少要測試過才發 Pull Request 過去喔"></a>1. 至少要測試過才發 Pull Request 過去喔</h2><p>千萬不要把沒有測試過的 Code 就直接發 Pull Request 塞過去,<br>這個一定會被巴死,而且人家會覺得你腦袋壞掉OTZ</p><p>現在很多的 Project 都會串上 CI 當你發送 Pull Request 過去時,<br>便會驅動 CI 進行測試,如果你的程式碼根本沒有通過測試,那麼 Project 的<br>maintainer 當然不會去幫你看囉。</p><p>也因此很容易就直接被忽略了。<br>而且這樣的行為可能會讓別人對你留下不是很好的印象喔!!</p><h2 id="2-符合人家的文化"><a href="#2-符合人家的文化" class="headerlink" title="2. 符合人家的文化"></a>2. 符合人家的文化</h2><p>正所謂家有家規、國有國法,當你想要為一個 Project 發送 Pull Request 的時候,<br>千萬要注意自己的 Coding Style 有沒有符合規範、命名的方式有沒有跟人家一致等等。</p><p>通常可以在人家的 Project 底下找到 CONTRIBUTING.md 或是 contributing guide ,<br>這裡面通常會記載很詳細的你要如何融入這個貢獻群的方式,<br>有些也會在這邊說明 Project 裡頭的 Coding Style 甚至是 Naming Style 喔!</p><p>如果沒有在 Project 中找到這些文化資訊的話,那麼就要利用自己 <em>敏銳</em> 的觀察力,<br>仔細的看一下該個 Project 裡面是怎麼樣的撰寫風格,然後盡量去符合 Code Owner 的<br>風格吧!</p><h2 id="3-要隨時跟想-merge-進去的-branch-保持同步"><a href="#3-要隨時跟想-merge-進去的-branch-保持同步" class="headerlink" title="3. 要隨時跟想 merge 進去的 branch 保持同步"></a>3. 
要隨時跟想 merge 進去的 branch 保持同步</h2><p>比如說今天你在 fork 過來的 Project 從他的 <code>develop</code> branch 拉出來一條<br>叫做 <code>new_feature_A</code> 的 branch,準備為這個 Project 新增一個新功能 A。</p><p>在非常認真的撰寫程式碼後,你終於完成了這個新功能 A,並且想要把它貢獻回去原本的<br>Project 中,這時候你可能會想發一個 Pull Request(PR) 想把你的 <code>new_feature_A</code><br>branch merge 到原開發者的 <code>develop</code> 的 branch 上。</p><p>但是那條你原先拉出來的 <code>develop</code> branch 可能已經有了更多的 commit 或是<br>merge 了許多的 PR。<br>於是你的 branch 跟原開發者的 branch 就會發生 conflict,沒辦法被原開發者使用自動<br> merge,因為會被 conflict 的情況卡住。</p><p>當然這時候原開發者沒有義務幫你解決這個 conflict 的問題,如果你沒有自己處理好,<br>原開發者可能也會直接不理這個 PR 了。</p><p>因此隨時與自己想要貢獻回去的 branch 同步也是很重要的喔!</p><p>通常我個人的同步做法是這樣的:</p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line">git fetch upstream</span><br><span class="line">git checkout new_feature_A <span class="comment"># 切到自己開發的那條 branch 上</span></span><br><span class="line">git rebase upstream/develop <span class="comment"># 跟原開發者的 develop branch 做 rebase</span></span><br></pre></td></tr></table></figure><p>不使用 git merge 跟 upstream 同步,而是用 rebase 的方式,<br>這樣可以保持整條 branch 的乾淨程度喔。</p><p>而到後來我自己很不喜歡看到 merge commit 這個點出現XD<br>畢竟有時候太多的 merge commit 實在會讓整個圖不太好看。</p><p>所以我自己會盡量少用 <code>git pull</code> 或是 <code>git merge</code> 這兩個指令的話,<br>而是改用 <code>git pull --rebase</code> 跟 <code>git rebase</code> 來取代喔!</p><h2 id="4-Squash-commits-的重要性"><a href="#4-Squash-commits-的重要性" class="headerlink" title="4. Squash commits 的重要性"></a>4. 
Squash commits 的重要性</h2><p>平常我們在開發的時候,很常進行 commit ,雖然只是要完成一件事情,<br>可是卻花了好幾個 commit 來完成這件事。最常見的是下面這個模樣:</p><figure class="highlight text"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br></pre></td><td class="code"><pre><span class="line">commit 1: Add feature meow</span><br><span class="line">commit 2: Fix typo</span><br><span class="line">commit 3: Fix missing OO</span><br><span class="line">commit 4: Fix bugs in feature meow</span><br><span class="line">...</span><br></pre></td></tr></table></figure><p>但是如果你要送上去的重點只有「Add feature meow」這件事情,我認為就應該只留下<br>表達出 「Add feature meow」 這件事的 commit log 就好,<br>這些細節(繁瑣)的過程留在 commit log 裡也沒有太大的好處。</p><p>這個時候我們就可以透過 <code>git rebase -i</code> 的方式來做 squash commits 讓我們的<br>commit log 變得更乾淨。</p><p>至於要怎麼使用 <code>git rebase</code> 來做 squash 呢?就留待之後的文章再來好好解釋啦!</p><p>P.S. 本文是以個人的經驗撰寫而成,<br>如果缺漏或需更正之處,歡迎各位提出更多的建議喔OuO。</p>]]></content>
<summary type="html"><h2 id="0-前言"><a href="#0-前言" class="headerlink" title="0. 前言"></a>0. 前言</h2><p>在漫長的程式旅途中,我們很容易就使用到 Free software(如 gcc) 以及<br>Open Source Project(如 Bootstrap) 。<br>身為用戶,我其實一直希望能為這些很棒的作品增加更多更棒的功能。</p>
<p>也許是身為工程師的浪漫吧,我很渴望能做出一款全世界都在用的軟體,<br>然後很自豪地說:「看呀,你在用的OOXX是我做的喔!」(燦笑)</p>
<p>而在這之前,我們可能更常碰到在使用 Open Source Project 的情況下遇到 Bug ,<br>或是想為其添加新功能。這時候,如果是在 GitHub 上頭,我們便會以 Pull Request<br>的方式把自己的修改發過去給 upstream 希望他們能把這樣的增進給 Merge 進去。</p>
<p>不過先別急著在改完就馬上送過去,不然可能很容易會被無視或是當成小白喔!</p></summary>
<category term="Note" scheme="https://hyd.ai/categories/Note/"/>
<category term="git" scheme="https://hyd.ai/tags/git/"/>
<category term="Pull Request" scheme="https://hyd.ai/tags/Pull-Request/"/>
<category term="PR" scheme="https://hyd.ai/tags/PR/"/>
</entry>
<entry>
<title>發現新玩具,用 Hexo 來寫網誌吧!</title>
<link href="https://hyd.ai/2016/05/22/helloworld/"/>
<id>https://hyd.ai/2016/05/22/helloworld/</id>
<published>2016-05-22T08:21:11.000Z</published>
<updated>2024-11-19T03:00:10.630Z</updated>
<content type="html"><![CDATA[<h1 id="準備更新網誌"><a href="#準備更新網誌" class="headerlink" title="準備更新網誌"></a>準備更新網誌</h1><p>長時間以來都是用 Google blogger 跟 logdown 兩個服務來寫網誌,<br>但是一個不吃 markdown 語法,只能用傳統的編輯,<br>另一個則是不是很穩,有時候會炸掉。</p><p>剛好發現這個工具 Hexo 可以用 markdown 來寫網誌,而且可以跟 GitHub Page 做結合,<br>那就來試用看看吧!</p><span id="more"></span><h1 id="一些渲染測試"><a href="#一些渲染測試" class="headerlink" title="一些渲染測試"></a>一些渲染測試</h1><h1 id="h1"><a href="#h1" class="headerlink" title="h1"></a>h1</h1><h2 id="h2"><a href="#h2" class="headerlink" title="h2"></a>h2</h2><h3 id="h3"><a href="#h3" class="headerlink" title="h3"></a>h3</h3><h4 id="h4"><a href="#h4" class="headerlink" title="h4"></a>h4</h4><p><strong>Bold</strong></p><p><em>Italy</em></p><p><em><strong>BI</strong></em></p><p><code>Pusheen</code></p><figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># bash</span></span><br><span class="line">$ <span class="built_in">echo</span> “Meow”</span><br><span class="line">$ <span class="built_in">cd</span> ~</span><br></pre></td></tr></table></figure><figure class="highlight cpp"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment">// cpp</span></span><br><span class="line"><span class="meta">#<span class="keyword">include</span> <span class="string"><iostream></span></span></span><br><span class="line"></span><br><span class="line"><span class="function"><span class="type">int</span> <span class="title">main</span><span class="params">()</span> </span>{</span><br><span class="line"> std::cout << 
“hello, world\n”;</span><br><span class="line"> <span class="keyword">for</span> (<span class="keyword">auto</span> i = <span class="number">0</span>; i < <span class="number">100</span>; i++) {</span><br><span class="line"> std::cout << i*<span class="number">100</span> << std::endl;</span><br><span class="line"> }</span><br><span class="line">}</span><br></pre></td></tr></table></figure><figure class="highlight python"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment"># python</span></span><br><span class="line">a = <span class="number">100</span></span><br><span class="line">b = <span class="number">1000</span></span><br><span class="line"><span class="keyword">if</span> a > <span class="number">100</span> <span class="keyword">and</span> b < <span class="number">50</span>:</span><br><span class="line"> <span class="built_in">print</span> (“Yes”)</span><br></pre></td></tr></table></figure>]]></content>
<summary type="html"><h1 id="準備更新網誌"><a href="#準備更新網誌" class="headerlink" title="準備更新網誌"></a>準備更新網誌</h1><p>長時間以來都是用 Google blogger 跟 logdown 兩個服務來寫網誌,<br>但是一個不吃 markdown 語法,只能用傳統的編輯,<br>另一個則是不是很穩,有時候會炸掉。</p>
<p>剛好發現這個工具 Hexo 可以用 markdown 來寫網誌,而且可以跟 GitHub Page 做結合,<br>那就來試用看看吧!</p></summary>
</entry>
</feed>