Photonic computing questions answered: we get technical

Four months ago I interviewed Lightmatter CEO Nick Harris on his photonic supercomputer. There are now almost 400,000 views and 2,600 comments … with lots of questions. In this episode, Harris answers the biggest ones.

  • is this real?
  • does the chip do what Lightmatter says it will do?
  • RGB vs CMYK?
  • how do you do linear algebra with light?
  • where can I get this?
  • capacitance issues
  • can it run DOOM?
  • can it run Crysis?
  • why hasn’t NVIDIA done this already?
  • how does this interface with quantum computing?

… and much more.


Support TechFirst: Become a $SMRT stakeholder


Scroll for full video, to subscribe to the podcast, and a transcript …

Photonic computing: answers

(Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.)

Subscribe: the CEO of photonic computing startup Lightmatter answers your questions

 

Read: will photonic computers run Doom or Crysis?

(This transcript has been lightly edited for length and clarity.)

John Koetsier: Hello, and welcome to the very first episode of ‘Mean Tweets on TechFirst’ with Lightmatter CEO, Nick Harris. Of course, I am mostly joking.

Months ago, we had Nick on TechFirst and we chatted about his photonic supercomputer, using light to vastly accelerate computing performance. The video has almost 400,000 views on YouTube and 2,500 comments. And we thought it’d be kind of fun to go through the comments — some of the nice ones, some of the mean ones, some of the tough ones — and get Nick to answer them. So welcome, Nick! 

Nicholas Harris: Yeah, thanks for having me, John. Good to be back. 

John Koetsier: Hey, super happy to have you. You’ve got some guts … hope you’re ready for this. Maybe before we dive into all those tweets, give us a quick update on what Lightmatter’s photonic computer is for those who didn’t watch the first video. 

Nicholas Harris: Yeah, so Lightmatter is building a photonic compute chip. It’s targeted at running artificial intelligence algorithms at extremely high speeds and using very little energy. It’s a completely new type of computer and we’re really excited to bring it to market. 

John Koetsier: Excellent. And you recently had some new news, some pretty exciting news as well. 

Nicholas Harris: Yeah, so we raised an $80 million Series B round and the goal with that money is to fund go-to-market with our product — it’s called Envise, and it’s our first-generation photonic compute accelerator. We’ve been hiring a bunch of people on the business side with that money and really focused on bringing this thing to the world.

John Koetsier: Nice. And there’s some big names among those investors, including Google Ventures, correct? 

Nicholas Harris: Yeah, so Google Ventures — GV as they’re called now — participated in the round, along with all of our existing investors and some new ones, including Hewlett Packard Enterprise.

John Koetsier: Excellent. So let me share the original video here and we will just go through some of those tweets and see what we see. Here’s one that’s actually interesting from Know Art,

‘I’d be really interested to know how it actually works. I have some understanding of transistors and logic gates, but I can’t understand how you’d actually do linear algebra with light.’

What’s your answer, Nick? 

Nicholas Harris: Yeah, so we can understand some of the basic operations. Linear algebra, if you look at it as a class of problems, you’re doing multiplication and addition … that’s really it. So if you can do multiplication and addition, you can do linear algebra.

And how do you do it with light?

What we do is we use a device called a modulator — that’s one approach; there are many ways to build photonic computers — and what a modulator does is take incoming light in one side and send it out the other. You can apply a voltage or a current, an electrical signal, to that modulator and attenuate the brightness of the light, for example. Or you can apply a phase shift proportional to the signal that you’re applying to the modulator.

So really think about it like, if you want to do multiplication, you apply an electrical signal to a modulator and it imparts that data on the optical stream. And so let’s say you wanted to multiply two numbers. You could have two modulators in series, as an example, so one followed by another, connected directly to each other. You’d apply a value here that’s B and a value here that’s A. And the value at the end of this thing would be B times A. Addition can be done in sort of a similar manner.

One way that you could do it would be by taking a bunch of different waveguides, for example, and then putting them all into a single waveguide. So shining a bunch of lights into a single channel — this would give you addition. There are a lot of different ways to do addition, but there’s multiply and there’s add; hopefully that makes sense. It’s all about modulators. 
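The multiply-and-add scheme Harris describes can be sketched as a toy numeric model. This is a deliberately idealized picture — lossless, noise-free devices, with `modulate` and `combine` as made-up illustrative names — not a model of Lightmatter’s actual hardware:

```python
# Toy sketch of photonic linear algebra as described above:
# a modulator imprints an electrical value onto light by attenuation,
# two modulators in series multiply, and merging waveguides into one
# channel sums their power. Ideal, lossless devices assumed throughout.

def modulate(light_in, signal):
    """Attenuate incoming light in proportion to an electrical signal."""
    return light_in * signal

def combine(*waveguides):
    """Merge several waveguides into one channel: the powers add."""
    return sum(waveguides)

light = 1.0                                        # normalized input brightness
a_times_b = modulate(modulate(light, 0.5), 0.8)    # two modulators in series: B * A
dot = combine(0.4, 0.1, 0.25)                      # optical addition of three channels

print(a_times_b, dot)
```

Chaining multiplies and then combining channels is exactly the multiply-accumulate pattern that matrix multiplication — and hence most of linear algebra — reduces to.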

John Koetsier: Very, very interesting. By the way, we learned a lot about ourselves in the first video. We learned that I say ‘very interesting’ a lot, so we’re going to just repeat that and [laughs] if it is interesting, I’m going to say it. And apparently you say — what was your phrase? ‘Yeah, so’ — is that correct?

Nicholas Harris: Yeah, so.

John Koetsier: Okay. Yeah, so. Well, let’s go with Random Guy, ‘Will it run DOOM?’ 

Nicholas Harris: Will it run DOOM? You know, what we’re focused on right now is not graphics, but we know that linear algebra as a class of computation can be used for doing graphics processing. I see a future, not too far away, where Lightmatter could support things like raytracing and rendering and things like this. So … hopefully soon. I love gaming and would love to be able to play in that space. 

John Koetsier: I was going to say ‘very interesting.’ I was not going to say very interesting. I said very interesting.

Well, you know, hey, Max White said almost the same thing right here, ‘I think the top 2 things for me are gaming and rendering. You mentioned it’s good at raytracing which is perfect for both,’ so he said it’s ‘exciting.’ I think the reality here is though, that you’re probably not going to be able to find this in Best Buy, right? This is not probably where you’re going with this chip right away. 

Nicholas Harris: Yeah, we’re focusing on selling this to enterprises, to big businesses. I would love to see a future where consumers found use cases that were compelling from a business perspective, where maybe you’d be able to go to Best Buy and buy a Lightmatter card.

But for right now, we’re really targeting very large companies that are deploying artificial intelligence. Hopefully there are markets going forward where individuals might want to deploy this kind of hardware. 

John Koetsier: Exactly. And there might be some individuals in those very large companies who just want to play DOOM on them as well, so that may be coming.

This was very interesting from WeeHee, who brings up, ‘Now we can literally say that RGB improves performance,’ which kicked off an entire debate which I was unaware of. I mean, I know the GIF/JIF debate, right? But I was not aware that there is a big internet debate over RGB versus CMYK.

Which is … RGB is red, green, blue, and CMYK is cyan, magenta, yellow and black — K is the black. CMYK is much more from the print world, of course. RGB is from your monitors. Settle the debate for us … is it CMYK or RGB? 

Nicholas Harris: Well, if I had to choose between those, I would say RGB.

But I think what they’re talking about is if you look at gaming computers today, like if you go to Newegg or one of these websites that sells components for building gaming computers, almost all of them have lights; they’ll have LEDs where you can customize what color the fans and the computer are, or the mouse or the keyboard.

And so I think there, the joke, the way that I took it is we can actually process on multiple colors at the same time with Lightmatter’s technology and it actually makes it faster. Whereas it’s sort of like a boondoggle or a toy right now that you can have red, green or blue [on] a mouse. 

John Koetsier: Now I understand. Excellent. There’s another joke about ‘yeah, so’ … we’ll skip that one. Faustin says, ‘linear algebra is the math Swiss army knife of computer science,’ which is cool. And then there’s a bunch like these and they got quite a few responses as well, right?

‘We need to see proof that this hardware really does what it says it does in real-world situations. We hear a lot of hype. Heard from a lot of startups, it’s mostly talk.’ So we have James Andrew and Jack Wright here. ‘I’ll wait until they actually have something.’

I mean, on the one hand it’s really good because you just raised $80 million from Google and others that’s — they don’t throw $80 million away. I mean, it’s not like they get it right every time, but that’s a proof point. What do you say in response to people like that? 

Nicholas Harris: Yeah, so I can tell you a bit about the backstory of Lightmatter. So I did my doctorate and a postdoc at MIT, and while I was there, I built the first systems that did this kind of linear algebra with silicon photonics, and that’s the platform that Lightmatter builds its technology on. So that technology has been published in Nature, Nature Physics, Nature Photonics, all the top journals in academia. It’s peer-reviewed. It works.

That’s how I got the first funding round from Matrix and Spark. At Lightmatter, we’ve built a series of chips. The way that we get the money is by showing the investors these things working. I know it sounds crazy, but it’s proven technology. 

John Koetsier: You know, I mean, if I’m investing $80 million, I’m going to be pretty bloody sure that I’ve had my hands on an actual piece of hardware, done some testing, had my geeks look at it and verify that it is correct.

John Koetsier: Here’s an interesting comment from Valteri — I won’t try the last name — commenting on this ‘general purpose AI accelerator’ positioning you’ve used. He says, ‘Is that what we’re calling matrix multiplication now? Still a big deal, of course, but I find marketing speak funny.’ Do you find marketing speak funny, or is it not marketing speak?

Nicholas Harris: Yeah, so I think in a sense it is marketing speak.

The technology that we built accelerates linear algebra, but what we’re building at Lightmatter — as a system, as an entire chip and a solution — is an artificial intelligence accelerator that can be applied to any type of AI. So that’s what I mean by general purpose AI acceleration.

If we got rid of all the software stacks — our software stack is called Idiom — throw that away, throw away the ASIC architecture and some of the other features in the chip, then yeah, it’s just linear algebra. But there’s a ton of work and a ton of intellectual property that wraps around that thing that makes it into an AI accelerator rather than just a linear algebra chip.

John Koetsier: Nice. Nice. Excellent. So, somebody liked me … my questions are ‘top notch.’ That’s great. Somebody liked you as well. We’ll skim over that really, really quickly. There’s somebody else who is skeptical. But here’s a good question, and this is about the low-k dielectric problem.

John Savard says, ‘My understanding about photonic computers is that they solve that problem. Light beams passing by each other side by side don’t cause capacitance problems, slowing each other down.’ Is that the right way of thinking about it? Is there another way of thinking about it? How do you respond to that? 

Nicholas Harris: Okay, so there are many contributors to energy consumption in computers today. Capacitance as a category is one of the biggest. You can think of a capacitor as sort of a well that you have to dump electrons into and then once it fills up, you’re like, ready to go. So it’s this very annoying property. So that’s the general idea.

Now, where do you find capacitors? You find them in the wires that connect everything on chips, where they can slow communications down a ton and dissipate a ton of power. You also find them in the transistor itself. So CMOS, complementary metal oxide semiconductor, is built on this idea of the MOS cap — ‘cap’ being a capacitor — and dielectrics and their dielectric constant k modulate the performance of that transistor.

As transistors were shrunk, we had to find higher-k dielectrics. They went to hafnium, and there are probably more exotic things at this point, but I’m not up to date on the materials. And yes, they’re challenging, but I don’t think as much — in the transistor world it’s not as much about the dielectric constant k as it is about the atomic thinness of that dielectric at this point.

So it’s an important piece. I think Lightmatter is solving the interconnect part, where the wires are really taking a ton of energy because of their capacitance. And we’re also getting rid of the MOS cap — that capacitor — in doing the computation. And so that’s a huge chunk of energy that you get rid of. 
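The capacitance cost Harris is describing comes down to one formula: charging a capacitance C to voltage V takes E = ½CV². A back-of-envelope sketch — where the capacitance and voltage figures are illustrative order-of-magnitude guesses, not measurements of any real chip:

```python
# Why wire capacitance costs energy: every 0 -> 1 transition on a wire
# must charge that wire's capacitance, paying E = 1/2 * C * V^2.
# Component values below are illustrative assumptions only.

def switching_energy_joules(capacitance_farads, voltage_volts):
    """Energy required to charge a capacitance C to voltage V."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

wire_cap = 1e-12      # ~1 picofarad for a long on-chip interconnect (assumed)
supply_v = 0.8        # typical modern logic voltage swing (assumed)

e_per_transition = switching_energy_joules(wire_cap, supply_v)

# If that wire toggles a billion times per second, it dissipates:
power_watts = e_per_transition * 1e9

print(e_per_transition, power_watts)
```

Multiply that per-wire power across the millions of interconnects on a chip and the appeal of carrying data as light — which doesn’t charge a capacitor per bit — becomes clear.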

John Koetsier: Nice. Nice. So we have some others, a little more fun, ‘instead of thermal throttling we’ll have darkness throttling.’ That’s facetious, we won’t get into that. 

Nicholas Harris: That’s exactly correct. 

John Koetsier: Excellent. Somebody else wants more technical details. That is exactly what you’re doing today. That’s great.

Somebody here, hey, you’re hiring, they’re doing a PhD in Photonics. That’s wonderful. Somebody wants to run Crysis — same answer as DOOM. Excellent. Excellent. Somebody else who is skeptical. Great. The $80 million might help with that, but you’ve got some work to do.

This one is actually interesting and that is in Cyrillic, the names, so I won’t mention that, but ‘I guess this is the next company that Nvidia is going to buy.’ And what’s interesting there is I recently just released a video about HiPerGator AI, which is a new supercomputer at the University of Florida. It’s ranked 22nd globally for supercomputers, it’s AI-focused, and it’s built with NVIDIA chips.

Interesting thing is, and I don’t think the NVIDIA rep that was on the call wanted that to be said, but the provost of the University of Florida said that they spun it up to top speed just recently, it has a ton of cooling and air conditioning, and it took 1.1 megawatts of power to run … and it is a room-sized system.

This is an old-school big computer that you buy a big building and put in there. So I don’t know that NVIDIA is going to buy you and I don’t know if they want to, but comment a little bit on what supercomputers look like today and what they might look like as you reach maturity and ship. 

Nicholas Harris: Yeah, so you’ve seen a lot of press recently — like Intel announced Ponte Vecchio, and in showing that new chip, they were showing the water cooling that’s absolutely required for it, because it’s a 600-watt chip. So you’ve got water pipes running into the chips. You imagine a supercomputer built out of these kinds of things … there’s water everywhere.

At the same time, you’re seeing Microsoft talking about immersion cooling, where they literally take baths of apparently edible — feel free to look it up on YouTube — coolant. It’s like an organic coolant and they dip the cards in this, and then basically as they heat up they generate bubbles that cool the system. So you’re seeing a lot of water being used and a lot of liquid phase cooling. That’s all about heat, trying to get the heat out of the chips and making it so that they can still get reasonable compute density.

Because with just air, you’d really have to space things out a bunch.

So what does the data center look like with Lightmatter? You get rid of the water — you don’t need it — and get a lot higher compute density, more compute per energy footprint. It means that, you know, not that I’d want them to do this, but they could reproduce that system at the University of Florida using far fewer of our chips than NVIDIA’s, and save power and money.

But we prefer to upgrade it so that it’s faster. 

John Koetsier: The plumber’s union will not be happy. [laughs] There’s a lot of plumbing that goes on and water that goes on and other things like that. Okay, back to the comments here, ‘awesome interview, very good questions, great answers.’ So there’s some kudos for you as well, that’s great.

Somebody wants to process local weather on it in real time. Maybe, who knows? I don’t know about local, and probably won’t be in your room. Another person is waiting for a shadow based processor. I’m sure it could be invented, there’s people who will do all kinds of crazy things, and 740 people liked that comment. I wanted to — ah, this was ‘kudos to the CEO for not selling it as a thing that will change everything as is often done with new technologies. He knows his strengths, isn’t afraid to admit the current shortcomings.’ That’s a great comment there.

Here’s one that’s interesting, let’s get into this one: this is from Shynamo, ‘Imagine having a PC with quantum processors as your CPU and light processors as your GPU. That would be cray fast.’ I’m guessing that’s crazy, but cray is kind of interesting given Cray supercomputers. ‘Hope we get there in a few more decades.’

Talk about that a little bit in terms of where you see — I know that your computer interfaces with standard computers quite well. In fact it’s built in, it’s working together. What about with quantum computers? 

Nicholas Harris: Yeah, so my PhD was focused on quantum computing. I think that it’s brilliant and it will change the world, but it’s going to take quite a while. So it’s going to be, in my opinion, something like 10, 20 years before there’s even an opportunity for a Lightmatter chip to be next to a quantum computer that’s at scale.

I think that quantum computers are really targeted at very specific applications. I would encourage anybody watching the video to go to nist.gov, they maintain a document that shows all of the different computational problems that quantum computers are known to speed up. And so this is vetted by scientists like legit people in the field, and there are papers that back up these results.

What you’ll find when you look at that list is, like, it’s not going to run Crysis. So it’s not going to be your CPU. To be clear, it is Turing complete, and Turing complete means, yes, the quantum computer could run Crysis, but it would be an awful waste of that incredible hardware. You’d be better served doing database search with Grover’s algorithm or any number of other things, maybe even cracking some RSA with Shor’s algorithm.
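The Grover’s-algorithm point is easy to put in numbers: unstructured search over N items takes about N/2 lookups on average classically, while Grover’s algorithm needs roughly (π/4)·√N quantum queries. A quick numeric comparison of query counts — not a quantum implementation:

```python
import math

# Query counts for finding one marked item among n, comparing an
# average-case classical search against Grover's algorithm.

def classical_queries(n):
    """Expected lookups to find one marked item among n, classically."""
    return n / 2

def grover_queries(n):
    """Approximate optimal Grover iterations for one marked item."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**6, 10**12):
    print(f"N={n:>13}: classical ~{classical_queries(n):.0f}, "
          f"Grover ~{grover_queries(n):.0f}")
```

The quadratic gap — roughly 785 queries versus 500,000 at a million items — is real but specialized, which is why such speedups live on a curated list rather than applying to everyday workloads like games.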

John Koetsier: Very interesting. There I go, I said it again. We need some more ‘and so’s’ from you, just keep the peanut gallery happy here.

This is a very interesting comment, and this of course is a type of comment that your PR department warns you about and says, ‘Hey, don’t comment on that’ or something like that. So use your own judgment here, from SwedishDeath_lama — love your name — ‘At CES a decade ago, NVIDIA had a small demo of their own research into optical computing. Obviously no sales at that point, but the engineer was talking about it, one of the most interesting things forever. So happy to see this tech come to market.’

Talk about competition in the space. Why has NVIDIA not come out with something here? Why have others not come out with something here? Why has it taken a decade? And how are you guys so advanced in this particular area compared to others?

Nicholas Harris: Yeah, so first of all, I would say that we need competition. We need a lot of people working on this, similar to what Elon did with open sourcing the patents for Tesla. His goal there was to try to drive adoption of electric vehicles because he knew there was a mismatch between the tech that Tesla had and what other people had.

I sort of view our industry the same way: the problem we’re trying to solve is really big. We’re the first company in this space. Since we were founded, there have been about nine other companies working on it. And yes, there have been press releases from Intel on them working on some form of optical computing. I’m not aware of NVIDIA doing anything on it, but I do know some of the executive team there, and let’s just say, lots of people are interested in this kind of field.

And I’d widely encourage people to look into it. I think it’s important. 

John Koetsier: Excellent, excellent. I skimmed over a bunch while you were answering that. I love this ‘Star Trek nerd here going Oooooh Photonic.’ Yeah, a lot of us are like that. Madras61 understood it’s for ‘deep learning cloud computing data centers and big companies.’

This is an interesting comment from Christian Reinisch, ‘running a photon processor on different light waves is a real step towards 3D processing.’ Any comments about 3D processing? 

Nicholas Harris: Yeah, so what I’m taking that to mean is, like, our brain is a 3D computer chip, if you will. It’s not just layers of neurons; they’re all connected in 3D in a volume. So when you use different colors in a Lightmatter processor, what it does is it creates another virtual processor on top of your current one for the same resource cost as the base layer. Another color gives you another layer, and so you can kind of build up a whole cube of compute composed of the different colors of light passing through the core.

So yeah, maybe it’s like 3D. I could go really geeky and talk about the interconnect that would need to go between those sheets, but I won’t.

John Koetsier: Interesting. Uh, there, I did it again. Yup [laughs]. I can’t help it, if I think it’s interesting, I say it. There’s somebody who wants a ‘yeah, so’ counter in the top right, so maybe we’ll do a version like that; another one with ‘interesting’ as well. Maybe two counters, there we go.

This is a great comment from Daniel Lee, ‘My ex-wife invited some friends over for dinner in ’96, one of them was an MIT grad student. He said at the time they were a couple of decades out.’ Wow, he seemed pretty close. That is a great comment. Here’s one from ThePhotovoltaicMan — great name — ‘How many different colors could you run through a core? Every nanometer of wavelength? Every 12, 20, 50, etc.?’ I’m going to like that comment, because it’s a great one. What’s your answer? 

Nicholas Harris: Yeah, so the number of colors depends on your ability to separate them. So separation of colors usually happens with a wavelength division multiplexer. And if you look at data centers today, there are a couple of—

John Koetsier: Like a prism? 

Nicholas Harris: It is. It literally is.

John Koetsier: [laughing] Okay!

Nicholas Harris: And you can do it with another device called a resonator. But there are some standards on this, there’s coarse wavelength division multiplexing and dense wavelength division multiplexing. In coarse, you have a lot more spacing in nanometers between the colors, and in dense, you have a lot less spacing.

So how many colors could you do? I think we can leverage the same exact laser infrastructure that people are using for telecom, and they are currently looking at 16 and 32 colors. It’s not ready yet, but I think that’s like a pretty reasonable spot to go. And that’s a heck of a lot of compute if you’ve got that many colors going on.
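The scaling Harris describes — each wavelength acting as an independent virtual processor through the same core — can be sketched as a toy model. The linear scaling is an idealization (perfect, crosstalk-free channels), and the 16- and 32-color counts are the telecom figures he cites, not measured Lightmatter numbers:

```python
# Idealized wavelength-division-multiplexing throughput model:
# every color riding through the core adds one full virtual
# processing layer, so throughput scales linearly with color count.

def effective_throughput(ops_per_color, n_colors):
    """Total throughput assuming each color is a fully independent layer."""
    return ops_per_color * n_colors

base = 1.0  # normalized single-color throughput
for colors in (1, 16, 32):
    print(f"{colors:>2} colors -> {effective_throughput(base, colors):.0f}x")
```

In practice loss, crosstalk, and laser power budgets would eat into that linear scaling, but it shows why reusing existing telecom multi-wavelength infrastructure is attractive.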

John Koetsier: Very, very impressive. I think we’re almost running out of the great, great questions and comments here. I’m sure there are others that we won’t see, but these are some of the later ones; they don’t have as many comments.

This is — I gotta love this, James Watson ‘counted how many questions were asked: 22 … 8 questions were yes/no, 20 were answered with yeah, so.’ So, you know, this is the internet. People have time on their hands to do things like that with questions and answers.

I just want to say thanks for sitting in for the first interview, that was a lot of fun and I did, indeed, think it was ‘very interesting.’ But I also appreciate this time … you’ve gone deeper, you’ve gone more technical in response to those questions. So we’ll see if people like that and what are the questions they have. As you release, of course, I want to know. I want to hear. I want to see it. I believe your original timetable when we talked a couple months ago was something coming to market later this year. A, am I correct? And B, are you on track for that? 

Nicholas Harris: Yeah, so we were targeting end of this year. Building chips is really hard … it’s going to be next year, hopefully middle of next year. So that’s where we’re headed. 

John Koetsier: Okay. Is this feature creep or is this just it’s really frickin hard?

Nicholas Harris: You know, it’s running a massive engineering organization and managing timelines and all these sorts of things. It’s pretty challenging stuff. If you look at timelines that are predicted by companies like NVIDIA and Intel, it’s constantly the case that they’re not able to nail the time. So it’s just hard stuff. 

John Koetsier: Yeah, it is hard stuff and we have read The Mythical Man-Month, of course. And so the infusion of capital doesn’t always help; adding more people doesn’t always help. In any case, Nicholas, thank you again for taking this time, do appreciate it. 

Nicholas Harris: Absolutely. Thanks, John.

5% of people make it down here

Made it all the way down here? Who are you?!? 🙂

The TechFirst with John Koetsier podcast is about tech that is changing the world, including wearable tech, and the innovators who are shaping the future. Guests include former Apple CEO John Sculley, the head of Facebook gaming, Amazon’s head of robotics, GitHub’s CTO, Twitter’s chief information security officer, scientists inventing smart contact lenses, startup entrepreneurs, Google executives, former Microsoft CTO Nathan Myhrvold, and much, much more.

Subscribe on your podcast platform of choice:

 


Want weekly updates? Of course you do …