I recently tried Stitch integrated with AI Studio (Gemini), and I have to say the experience is genuinely impressive.
This is no longer traditional "natural-language programming." The current flow works like this: Stitch first generates high-quality UI design drafts, then AI Studio takes over, recognizes the visual elements, and outputs a complete code framework in one click. The entire pipeline from design draft to usable code is connected.
I tested it on a fairly simple product prototype, and the process was surprisingly smooth. The design drafts came out well, recognition accuracy was good, and the generated code had a clear structure that I could iterate on directly. This "visual-language programming" approach is far more efficient than generating code from pure text instructions.
If this workflow keeps iterating, the potential gains in development efficiency are substantial.
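For anyone curious what the hand-off step might look like under the hood, here is a rough sketch of packaging a design-draft image into a Gemini vision request. The request shape follows Google's public generateContent REST API; the model name and prompt wording are my own assumptions, not details from Stitch or AI Studio.

```python
# Sketch: package a UI design draft (PNG) as a Gemini generateContent
# request body. Model name and prompt text below are illustrative
# assumptions, not the actual Stitch/AI Studio internals.
import base64
import json

GEMINI_URL = ("https://generativelanguage.googleapis.com/v1beta/"
              "models/gemini-1.5-flash:generateContent")  # assumed model

def build_request(png_bytes: bytes) -> str:
    """Return the JSON body asking Gemini to turn a UI draft into code."""
    payload = {
        "contents": [{
            "parts": [
                # Text part: the instruction describing the task.
                {"text": "This is a UI design draft. Recognize the visual "
                         "elements and output a complete code framework."},
                # Image part: the draft itself, base64-encoded inline.
                {"inline_data": {
                    "mime_type": "image/png",
                    "data": base64.b64encode(png_bytes).decode("ascii"),
                }},
            ]
        }]
    }
    return json.dumps(payload)
```

You would POST this body to `GEMINI_URL` with an `x-goog-api-key` header; the response's text part would contain the generated scaffold.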
GasBandit
· 13h ago
Bro, this combination is really impressive. It integrates the entire code workflow seamlessly and can save so much manual effort.
consensus_whisperer
· 14h ago
Really? Designing directly into code? Then wouldn't I be out of a job, haha?
AirdropHunterXiao
· 14h ago
Damn, does this mean designers are about to be out of a job?
But honestly, Stitch's combo this time is really impressive, turning design sketches directly into code frameworks—pretty awesome.
By the way, is the accuracy really that high? I feel like I still have to manually tweak quite a few details...
If this gets stable, outsourcing shops will really have nothing left to do.
Us programmers might start feeling shaky too...
But when it comes to efficiency, it does look really top-notch, and I'm a bit tempted.
InscriptionGriller
· 14h ago
Huh, isn't this just industrializing the development process? One-click design-to-code sounds pretty good... but in practice, who knows when you'll end up with a pile of hallucinated garbage code.
SolidityStruggler
· 14h ago
Wow, this combination is really awesome. The design sketches instantly turn into usable code, boosting development efficiency!
GasFeeAssassin
· 14h ago
Generate code directly from design sketches? If this really becomes stable, the development model will have to change.