Huma. Really can't consider anything else since becoming aware of it.
I'm afraid switching to it when you already have lots of code is pretty problematic, even though I see in the documentation that it has an adapter for gorilla/mux.
It's worth it. I was using swaggo before, and that was pretty bad: going from comments to structs and resolvers etc. But what I have now is beautiful, functional, and way easier to use.
I'm considering using it only for the API spec, not for the routing itself.
Spec first with oapi-codegen's strict server interface. We do a little custom generation on top of that to support our modulith approach, but it's working well.
This 100%. I don't know the tools being described here, but absolutely spec-first. It's the only way your spec stays accurate: generate a service layer and a client layer from the spec. Tools exist for every language, and templating languages let you control the generation.

CI should regenerate the client and service; if that produces a `git diff`, it errors and the dev needs to go regenerate and retest, to validate they didn't break an endpoint by developing and testing against a non-generated service or client layer.

Spec, service, and client stay in sync through tooling, and best of all, OpenAPI contracts and JSON Schema are really robust, so you can get all kinds of guards and verification generated automatically from the JSON Schema type definitions.
Top-down instead of bottom-up; or to rephrase it: write Swagger/OpenAPI by hand and generate boilerplate from it, using https://github.com/oapi-codegen/oapi-codegen
Yep, this is my approach. The output for the Echo framework integrates particularly nicely - I just have a codegen step in my `api` directory to keep things updated, and I exclude the generated file from git entirely.
Maybe you want to share how you have integrated authentication? :)
That's typically the first middleware
grpc, grpc-gateway, grpc-ecosystem. EDIT: also Huma (but I personally still prefer grpc).
grpc, especially after switching my web frontend to HATEOAS with Templ. Now my API is significantly more stable, as I don't have to constantly modify it to do frontend work.
Can you say more here? Are you using grpc on the front end or are you removing the need for types on the client because you are responding with HTML?
I use grpc only for non-web clients (e.g. providing a client library or a CLI). Both the HTTP handlers for templ and the implementations of the grpc interfaces use a common service layer. And yes, I respond with HTML in my web handlers.
I'm not using grpc for browser clients. Just map the things (in the grpc schema) to REST and that's that.
Got it. Thanks!
In an ideal world, your team would first design the API, write its Swagger, and then implement it. In reality, we update the Swagger at the end of each iteration before release. (We use Apicurio Studio)
Yeah - I sit down with FE and discuss upcoming changes, and the outcome of that meeting gets documented in swagger.
We also generate code (mostly model structs) from the swagger definition using openapi-generator.
Why is this ideal? Designing an API, instead of winging it, doesn't mean that you have to write Swagger first.
In my experience, requirements change, clients change their minds, other teams are behind schedule, and unforeseen challenges arise. None of these are good excuses, but after a while, we settled on updating the OpenAPI specifications as the last step (also, we're in the middle of a huge refactor; hopefully we reach a point where we write Swagger first).
grpc and grpc-gateway. We write all our APIs in protobuf. Comments there generate our Swagger docs. The compiled proto generates the stubs we use to write the server and the HTTP proxies. CLI clients can connect to the server directly using grpc. HTTP clients, including our UI (browser), connect to our server through the HTTP proxy.

For reference:

- Our protobuf definitions: [https://github.com/alphauslabs/blueapi](https://github.com/alphauslabs/blueapi)
- Generated Swagger docs: [https://labs.alphaus.cloud/blueapidocs/#/](https://labs.alphaus.cloud/blueapidocs/#/)
- grpc-gateway: [https://grpc-ecosystem.github.io/grpc-gateway/](https://grpc-ecosystem.github.io/grpc-gateway/)
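For readers unfamiliar with this setup: the HTTP proxy mapping lives in the proto file itself, via the `google.api.http` option that grpc-gateway reads. A minimal sketch (service and message names are hypothetical, not from the linked repo):

```protobuf
syntax = "proto3";

import "google/api/annotations.proto";

service Greeter {
  rpc SayHello(HelloRequest) returns (HelloReply) {
    // grpc-gateway exposes this RPC as GET /v1/hello/{name} on the proxy.
    option (google.api.http) = {
      get: "/v1/hello/{name}"
    };
  }
}

message HelloRequest { string name = 1; }
message HelloReply { string message = 1; }
```

From this one file the toolchain produces the Go server stubs, the reverse proxy, and (via the openapiv2 plugin) the Swagger document, which is why everything stays in sync.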
We go the inverse direction and generate the skeleton handlers from the API docs (with homegrown code) before we implement the logic. There's no magic to either approach: keeping your docs and code in sync is one of those chores you have to be diligent about. CI/CD checks can minimize the build-up of warnings and similar issues, but otherwise it's really a matter of doing the work despite it (usually) not being the most fun part of the job.
By hand manually like a psychopath
After looking around and being frustrated by what was available, I made my own library to generate OpenAPI documentation from structured endpoints. I use it on a few personal projects and at my company. https://github.com/schmurfy/chipi
Love Huma.rocks! We of course dump the OpenAPI file and render a TS client from it. Makes developing a breeze.
My team uses annotations in code for both Laravel and Go projects. We stopped maintaining piles of YAML files. I use GoLand and PhpStorm but don't see any warnings, and didn't get errors via the command line:

`docker exec -it --user gfly gfly-web swag init`

or

`swag init`

[https://github.com/swaggo/swag](https://github.com/swaggo/swag)
Huma
We use Huma. The OpenAPI spec is way too verbose and hard to edit by hand, yet we wanted a declarative approach. Huma offers the best of both worlds. We generate the OpenAPI spec on merge.
I was also using swaggo/swag and felt the same. You should check oapi-codegen. I know autogenerated code might not sound great, but trust me, it's really flexible.
I built my swag commands into my pipeline. https://github.com/opsdata-io/opsdata/blob/016931324abe91cce3e9ac5d8202e71885784995/.github/workflows/main.yml#L49
I write it manually. Split into multiple files and folders. Then use it to generate boilerplate code using openapi-codegen
goa
I have never been satisfied with the quality of any generated OpenAPI spec. I wrote a generator myself for a project, but I would not recommend that.
Honourable mention for Princess Beef Heavy Industries (https://pb33f.io/), who have several Swagger and OpenAPI tools. We used the linter to check our OpenAPI docs, and their Wiretap tool is great for checking that responses match the requests.
TypeSpec to generate the OpenAPI spec; handlers can be generated from there. I hate handwriting YAML. Going to check out Huma though; it looks a little promising.
I use this repo: [https://github.com/wI2L/fizz](https://github.com/wI2L/fizz), but it seems to be no longer maintained.
https://learning-cloud-native-go.github.io/docs/routes-and-openapi-specification/
I do not know what errors/warnings you mean. We use it for all of our projects and it works like a charm (provided the user knows what they are doing).
My team uses [kun](https://github.com/RussellLuo/kun), which allows you to define service specifications in native Go and automatically generates the OpenAPI document/Protocol Buffers for you. Disclaimer: I'm the author of this project. :)
There's a project called Fuego which generates OpenAPI docs from your code based on type signatures.
I use Goa, a DSL for declaring the spec that can generate both gRPC and REST endpoints.

Why not spec first? Because it's easy to write the DSL with the help of the Go LSP, code autocomplete, and AI copilots. Why bother writing the spec all by hand or using some clunky UI?

BTW, Goa can generate the swagger file, so the spec is always synced with the server code.
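For readers unfamiliar with Goa: the spec is declared in a Go design package along the lines of the sketch below (service, method, and field names are illustrative). The file is consumed by `goa gen`, which produces the transport code and the OpenAPI/Swagger document; it is not run directly.

```go
package design

import . "goa.design/goa/v3/dsl"

// A minimal, illustrative design: one service with one method,
// exposed over both HTTP and gRPC.
var _ = API("calc", func() {
	Title("Calculator Service")
})

var _ = Service("calc", func() {
	Method("add", func() {
		Payload(func() {
			Attribute("a", Int, "Left operand")
			Attribute("b", Int, "Right operand")
			Required("a", "b")
		})
		Result(Int)
		HTTP(func() {
			// Map the method onto GET with path parameters.
			GET("/add/{a}/{b}")
		})
		GRPC(func() {})
	})
})
```

Because the design is ordinary Go, the editor tooling mentioned above (LSP, autocomplete) works on it directly, which is the commenter's argument against hand-writing YAML.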
I would recommend using https://huma.rocks/. It has many more features beyond just auto-generating docs.