Integrating AI in Software Development and the Evolution of Java: Andrew Lombardi and Joseph Ottinger
In this interview, Andrew Lombardi, the head geek at Mystic, and Joseph Ottinger, a seasoned programmer, share their experiences and insights on various topics including the integration of generative AI in software development, significant advancements in the enterprise Java ecosystem, and the evolving landscape of IoT systems. They also discuss their new book, "Beginning Spring 6: From Novice to Professional," strategies for staying ahead of emerging technologies, and advice for B2B marketers aiming to connect with software development leaders. Through their candid conversation, Andrew and Joe provide a comprehensive look at the current state and future trends of the tech industry.
InfoQ: Joe, Andrew, can you tell us a little bit about yourself and what you’re up to these days?
Andrew: Andrew here, I’m the head geek in charge at Mystic, where we opened our doors in 2000 (fun year for tech). We’ve been in business for 24 years now and have helped companies as big as Walmart and as controversial as Twitter 1.0 succeed in their project goals. After a nearly five-year engagement with Twitter, we’ve stayed in flight, helping companies succeed and deploying solutions using a variety of technologies and platforms, including Java (of course), Kotlin, React, Python, and Rust. We’ve got some R&D projects with generative AI, and we’re doing plain ol’ boring AI/machine learning for some clients as well.
Joe: And I’m Joe! I’m pretty much a meerkat of little brain or something, churning out content on a fairly regular basis. I’ve been programming for something like - gosh, how old am I? - over forty years now, starting with BASIC on a Timex/Sinclair 1000 and progressing through Pascal, COMAL, COBOL, xBase, the C-likes (C and C++), ColdFusion, Perl, Python, and on into Java, Scala, and Kotlin, and probably a bunch I’ve forgotten or want to forget. I’ve focused a lot of energy on the transfer of information throughout my career, trying to bend the envelope where I can and make sure others can see farther than I can, which is part of why I’ve kept writing for so long.
InfoQ: What do you see as the main opportunities and challenges for integrating generative AI into software development, and how do you address the common misconceptions about AI potentially replacing human roles?
Andrew: The most significant opportunity is also the greatest challenge: how to take advantage of generative AI while maintaining quality and ensuring the written code solves the real problems. A lot of the talk around AI involves it “taking your jobs,” and while that fear accompanies every emerging technology, the people who benefit the most are the ones who can adopt it and use it as a multiplier in their role.
Joe: The biggest challenge there is, I think, awareness. Most people think of AI as this giant blob of… something. They don’t really know what it does; they just ask it a question and it responds with something that confirms their biases, for the most part, and thus it’s sort of like the abyss looking back at you - if it says the same thing you might have said, well, surely it can do what you do, and thus it’s a competitor, right?
But that’s not the way it actually is - what it’s feeding back to you is what other people might have said themselves, so it’s really regurgitating a sort of ‘inferenced’ common knowledge, but that’s not the same as real knowledge or real innovation.
Humans are always going to be more flexible; the computer blends information better and faster, but we’re better thinkers, and likely always will be. That’s our opportunity: when it comes to AI, we know better than the AI does what its limits are, and how to focus and exploit them. AI is a better mousetrap for some kinds of problems, is all. It’s horribly useful, especially for those problems, and gosh, I rely on it! But without me knowing how to focus it, it’s not going to be able to replace me. [That’s what the AI told me to write, at least. Was it good?]
InfoQ: The enterprise Java development ecosystem has experienced notable changes and advancements in recent years. What would you say are some of the biggest developments and trends?
Joe: I think the biggest shifts have come from the JDK team itself. Coding on the JVM hadn’t changed all that much in years - people managed to shoehorn reactive and functional programming onto the JVM with quite a bit of success, really, considering how little the JVM catered to those paradigms. But once Java 11 came out and we started seeing actual technical innovation in the JVM again, you saw real support for functional programming come into play, and with Java 21 and Loom, you see reactive-style programming really starting to come into its own without requiring such specialized code, for the most part. The JDK team isn’t stopping there: the native method invocation process is changing, we’re likely to see value types at last… just a lot of little things we’ve wished Java had for a decade or more are really starting to come into play.
How will that affect the library ecosystem? It’s hard to say. Right now, it’s mostly incidental improvement, because the ecosystem as a whole still has to support the older JVMs, but as that requirement lessens and people learn more about what the new JVM features do for them, we’ll see more and better changes from the ecosystem as a whole. Or so I think!
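For readers who haven’t tried Loom yet, here is a minimal sketch of the shift Joe describes: on Java 21, plain blocking code can run on virtual threads, giving the scalability that previously pushed teams toward reactive frameworks, without the specialized code. The task count and the sleep below are placeholders standing in for real blocking I/O.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

public class VirtualThreadsDemo {
    public static void main(String[] args) {
        // Java 21: each submitted task gets its own lightweight virtual thread,
        // so plain blocking calls scale without reactive plumbing.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 10_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(100); // stands in for a blocking I/O call
                        return i;
                    }));
        } // close() waits for the submitted tasks to finish
    }
}
```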
InfoQ: Earlier this year, you and Andrew published a new book, “Beginning Spring 6: From Novice to Professional” (Apress). How are developers using Spring in 2024? What kinds of applications are they building?
Joe: I don’t think it’s really changed: people are still building complex variants of the PetStore, over and over and over again. The requirements are different - we’re now seeing security moved around and done in a lot of varying ways, you see a lot of messaging and data warehousing take place where people want to preserve application state so they can replay it at will, and so forth - but from ten thousand feet, it’s largely the same, with some of the library names being changed from place to place.
It’s a lot stronger now than it was - the days of having to roll up Struts actions are long gone, and you really don’t do WebMVC the way you used to either, unless you have fairly archaic requirements. People expect rich front ends now, and the performance enhancements you have at your disposal are different from what they used to be. And deployment has exploded. Thankfully, there’s a huge body of knowledge out there to help - it’s not all the way there yet, but that’s the way growth is; people start working with the raw materials and refine until there’s a standard practice.
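To make that concrete, here is a minimal sketch of the kind of backend Joe is describing: a Spring 6-style REST controller that serves JSON for a rich front end to render. The Pet record, the sample data, and the paths are invented for illustration.

```java
import java.util.List;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical "pet store" style endpoint: the backend just serves JSON,
// and a rich front end (React, etc.) renders it.
@RestController
@RequestMapping("/api/pets")
public class PetController {

    record Pet(long id, String name) {}

    private final List<Pet> pets = List.of(new Pet(1, "Rex"), new Pet(2, "Whiskers"));

    @GetMapping
    public List<Pet> all() {
        return pets;
    }

    @GetMapping("/{id}")
    public Pet byId(@PathVariable long id) {
        return pets.stream()
                .filter(p -> p.id() == id)
                .findFirst()
                .orElseThrow(); // a real app would map this to a 404
    }
}
```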
InfoQ: Joe, you've recently been involved in designing and deploying services for Internet of Things (IoT) integration across various platforms. Could you share some details about your work in this field? Additionally, how are edge computing and AI influencing the development and deployment of IoT systems?
Joe: It’s changing the deployment model. IoT has really brought languages with VMs to the forefront; we’re not really doing ladder programming or machine-level coding as much as we used to, especially for rapid development. You certainly can - I’d never sneer at someone using Rust for deployment on an IoT platform, for example, although I’d marvel at their investment in compilation times - but what I have seen in IoT is a focus on runtimes like Node, Python, and the JVM (yes, even the JVM, which has much better performance on small devices than it used to!) because they enable reliable cross-platform development much better than they used to.
Of course, what we’re not seeing very much is a focus on security - everyone in IoT I know is aware of it, and is aware of the looming disaster that IoT security represents, but it’s a problem we’ve not really solved well yet. There are rumblings, and there have been for years; who knows, maybe Matter can help us solve it once and for all? But I still have my doubts.
As far as AI: AI on IoT is still an external service. None of the devices I’ve deployed on, even the “bigger devices,” have the horses to actually run any of the really fun AI models. They can do some inferencing and other simpler math-based models, but that’s not really what people think of these days when they think of AI: they think of conversational models, and those are expensive on those tiny boxes. Running a simple “Hi, how are you today” query… well… if I’d issued it on one of my better edge devices when I first picked up this interview, let’s just say it would almost be ready to respond with its first word by now.
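As a rough sketch of the pattern Joe describes - the model runs off-device and the edge client just forwards a prompt over HTTP - something like the following would do. The endpoint URL and JSON payload are hypothetical placeholders, not a real service.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// The edge device does no model inference itself; it just forwards the
// prompt to an external AI service and prints whatever comes back.
public class EdgeAiClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String payload = """
                {"prompt": "Hi, how are you today?"}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://ai.example.internal/v1/chat")) // hypothetical service
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        // Blocking call: the heavy lifting happens off-device.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```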
InfoQ: Andrew, as a company that has embraced remote work for 24 years, how have you adapted to the recent industry-wide acceptance and challenges of remote work since 2020? What strategies have you implemented to maintain team cohesion and productivity across multiple time zones?
Andrew: The change we’ve seen in the industry since 2020 is an acceptance (and, for some more recently, a backlash against the idea) of knowledge workers being able to ditch the commute and work from wherever they feel comfortable. Investment in connectivity, ample hardware, and desk setups at home during the pandemic paved the way for converts to the work-from-home lifestyle. The challenges involve nurturing a team that spans multiple time zones and occasionally having conference calls at 4 a.m. The online tools are all available and have evolved enough to ensure your team's success.
InfoQ: As the industry evolves, what strategies do you employ to ensure you and your team stay ahead of the curve in terms of adopting emerging technologies and methodologies, and how can B2B marketers align their offerings with these evolving needs?
Andrew: Our team is always focused on helping our clients succeed in the most efficient and sustainable way possible. To that end, we encourage our team to contribute to open source projects and to have hobbies, which for engineers usually involves learning a random programming language “for fun”. The folks on our team have usually been in the industry for a decade or more, so emerging technologies and methodologies are taken with a grain of salt and evaluated for how they can help our clients. Marketers can align their offerings by providing a wealth of instructional material and guides that make adoption, when it’s appropriate for our client, super simple.
InfoQ: When researching new technologies, what sites/publications do you typically visit?
Andrew: These days it ends up being found through searches with DuckDuckGo or Perplexity, and by interrogating the various other large language models out there, whose source-material links usually lead back to Reddit or Stack Overflow.
Joe: For me, I talk to anyone and everyone as if I’m a total neophyte. No judgment. If someone tells me about the new hotness, I don’t care if it’s a technology I’m familiar with or not, or even if it’s a deployment environment I use; if you mention, say, “Primate JS” and describe what it is and what it does, I’m listening. I want to be a constant learner, because you never know what really cool idea or tech is lurking behind someone’s raw passion. You never know how your perspective will change when someone else’s input is respected.
InfoQ: What advice would you give to B2B marketers looking to reach software development leaders?
Andrew: Tutorials on your product, showing it in action, with examples written in the most appropriate languages. If your product is hidden behind a sales lead generation form with no way to kick the tires or read more, you’ve already lost.
Joe: Holy cow, yes! Approach the product from a stance of ignorance. If someone who knows nothing about your product or service can’t figure it out in two minutes or less, you’ve lost them - even if they have to use your service, they’re never going to really buy in. Show, don’t tell. Fill in all the gaps. Assume they know little about how you think about problems.