c2ca1494e5  2026-03-15 23:01:46 -05:00  Jason Quense
    fix(opencode): preserve prompt tool enables with empty agent permissions (#17064)
    Co-authored-by: jquense <jquense@ramp.com>

01d518708a  2026-02-19 18:37:55 -06:00  Dax
    remove unnecessary deep clones from session loop and LLM stream (#14354)

e269788a8f  2026-02-12 04:54:05 +00:00  Kyle Mistele
    feat: support claude agent SDK-style structured outputs in the OpenCode SDK (#8161)
    Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
    Co-authored-by: Dax Raad <d@ironbay.co>

0fd6f365be  2026-02-10 19:55:22 -06:00  Aiden Cline
    fix(core): ensure compaction is more reliable, add reserve token buffer to ensure that input window has enough room to compact (#12924)
    Co-authored-by: James Lal <james@littlebearlabs.io>

ec720145fa  2026-02-01 14:57:47 -06:00  Aiden Cline
    fix: when using codex sub, send the custom agent prompts as a separate developer message (previously sent as user message but api allows for instructions AND developer messages) (#11667)
    Co-authored-by: Carlos <carloscanas942@gmail.com>

d9f18e4006  2026-01-30 19:53:22 -06:00  Steffen Deusch
    feat(opencode): add copilot specific provider to properly handle copilot reasoning tokens (#8900)
    Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
    Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>
    Co-authored-by: Aiden Cline <aidenpcline@gmail.com>

11d486707c  2026-01-30 09:36:57 -05:00  Aiden Cline
    fix: rm ai sdk middleware that was preventing <think> blocks from being sent back as assistant message content (#11270)
    Co-authored-by: opencode-agent[bot] <opencode-agent[bot]@users.noreply.github.com>

03ba49af4e  2026-01-29 10:14:28 -06:00  Ravi Kumar
    fix(telemetry): restore userId and sessionId metadata in experimental_telemetry (#8195)

29ea9fcf25  2026-01-28 21:55:50 -06:00  Aiden Cline
    fix: ensure variants for copilot models work w/ maxTokens being set

870c38a6aa  2026-01-28 21:28:15 -06:00  ideallove
    fix: maxOutputTokens was accidentally hardcoded to undefined (#10995)

ac53a372b0  2026-01-26 13:18:08 -05:00  Aiden Cline
    feat: use anthropic compat messages api for anthropic models through copilot

94dd0a8dbe  2026-01-25 17:27:24 -05:00  Aiden Cline
    ignore: rm spoof and bump plugin version

6d574549bc  2026-01-21 18:46:41 -06:00  Daniel Rodriguez
    fix: include _noop tool in activeTools for LiteLLM proxy compatibility (#9912)

c89f6e7ac6  2026-01-21 15:10:08 -06:00  Aiden Cline
    add chat.headers hook, adjust codex and copilot plugins to use it

14d1e20287  2026-01-16 15:25:23 -06:00  Aiden Cline
    Revert "fix(app): support anthropic models on azure cognitive services" (#8966)

b8e2895dfc  2026-01-16 15:24:06 -06:00  Unies Ananda Raja
    fix(app): support anthropic models on azure cognitive services (#8335)

d7192d6af9  2026-01-15 19:25:58 -06:00  Aiden Cline
    tweak: set opencode as user agent for most inference requests

9b57db30d1  2026-01-15 22:01:15 +00:00  seilk
    feat: add litellmProxy provider option for explicit LiteLLM compatibility (#8658)
    Co-authored-by: Mark Henderson <Mark.Henderson99@hotmail.com>
    Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>

92931437c4  2026-01-15 01:31:50 -06:00  Aiden Cline
    fix: codex id issue (#8605)

f9fcdead55  2026-01-13 23:44:39 -06:00  zerone0x
    fix(session): skip duplicate system prompt for Codex OAuth sessions (#8357)
    Co-authored-by: Claude <noreply@anthropic.com>

4752c83155  2026-01-10 18:21:51 -06:00  Spoon
    feat: pass sessionID to chat.system.transform (#7718)

172bbdaced  2026-01-09 17:47:37 -06:00  Aiden Cline
    feat: codex auth support (#7537)

2e4fe973c9  2026-01-07 17:32:38 -06:00  Aiden Cline
    fix: issue w/ normal transform options conflicting w/ small model options when gen-ing title

554572bc39  2026-01-04 13:28:22 -06:00  Melih Mucuk
    fix: prevent main model thinking variant from applying to small model (#6839)
    Co-authored-by: Melih Mucuk <melih@monkeysteam.com>

351ddeed91  2026-01-01 17:54:11 -05:00  Dax
    Permission rework (#6319)
    Co-authored-by: Github Action <action@github.com>
    Co-authored-by: Adam <2363879+adamdotdevin@users.noreply.github.com>

81fef60266  2025-12-30 16:37:32 -06:00  Aiden Cline
    fix: ensure variants also work for completely custom models (#6481)
    Co-authored-by: Daniel Smolsky <dannysmo@gmail.com>

8f629db988  2025-12-30 13:13:18 -06:00  Ytzhak
    feat: add extract reasoning middleware (#6463)

ed0c0d90be  2025-12-29 21:43:50 -06:00  Aiden Cline
    feat: add variants toggle (#6325)
    Co-authored-by: Github Action <action@github.com>

1e4bfbcf6f  2025-12-17 10:35:43 -06:00  Qio
    add OPENCODE_EXPERIMENTAL_OUTPUT_TOKEN_MAX to override 32k default (#5679)
    Co-authored-by: qio <handsomehust@gmail.com>

b8204c0bb7  2025-12-17 10:20:10 -06:00  Shantur Rathore
    fix: config option setCacheKey not being respected (#5686)

72ebaeb8f7  2025-12-15 20:00:26 -06:00  DS
    fix: rejoin system prompt if experimental plugin hook triggers to preserve caching (#5550)
    Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

81134cf61e  2025-12-15 14:34:56 -06:00  Aiden Cline
    add ability to set topK

b021b26e77  2025-12-14 22:51:11 -06:00  DS
    feat: restore experimental.chat.messages.transform and add experimental.chat.system.transform hooks (#5542)

fed4776451  2025-12-14 21:11:30 -05:00  Dax
    LLM cleanup (#5462)
    Co-authored-by: GitHub Action <action@github.com>
    Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>