Ep #4 - Danning from 0x Network/Matcha - DEX Aggregators and DEX Data

Danning's journey into crypto data science, what she works on today, and a wide-ranging conversation about what DEX aggregators do and how you can measure their success.

Transcript

Boxer  0:01  

Welcome everybody to a new episode of the Weekly Wizard. Today, I'm joined by Danning from the 0x protocol - the 0x network - and Matcha. 

And, yeah, I'm really happy that you found the time. It's always great to get to talk to some wizards. Welcome to the show.

Danning  0:22  

Thank you for the invite, Boxer. Glad to be here. And thank you, everyone, for coming and joining.

Boxer  0:31  

I think it's always interesting to hear the stories of how people got into crypto. So, maybe you could tell us about your first touch point? Like, what first made you aware of crypto?

Danning  0:43  

Yeah, so I think I was really lucky that I was in school in the data science master program. And then, through the program, there was a course about learning Spark, Hadoop, all that big data stuff. 

And then I was like, I need to find a course project with my project mate. And we were like, blockchain data is big data, so maybe we should take a look at that and try to apply some machine learning and data science to it. 

One thing we were trying to do was to de-anonymize wallet addresses, which is still a problem right now. It's still a mess right now. So, we were looking into the data. And then Joe Lubin from ConsenSys was giving a talk on our campus, and we came up with questions. 

He was super generous, saying, "I could connect you with our data science team at ConsenSys." And that's how I ended up with an internship and then full-time, so that was the start of my journey.

Boxer  1:41  

Well, it's pretty interesting. Usually, people were trying to make a quick buck or something. That's usually the story. And you're just like, yeah, I was actually here for the tech.

Danning  1:52  

Yeah, exactly. But honestly, I think there's a nuance there. So, I was doing data science. At that time, everyone was still like, oh, machine learning, AI is the future. 

So, I wasn't someone who already realized the value of blockchain. When I entered it, I was just lucky to get into it. And then honestly, for the first year, or maybe the first half year at ConsenSys, when people asked me, "So what's the difference between Bitcoin and Ethereum?" 

I honestly didn't know how to answer. Only gradually, after a few months - almost after a year - when I actually got into the topic of DEXs, and then some topics like TCRs, token curated registries, and CryptoKitties as well, did I realize it's fun. And it's also super deep, honestly.

Boxer  2:45  

So, I guess this was 2016. Then just from the…

Danning  2:50  

Yeah, 2017.

Boxer  2:53  

All right. And so, back then, there was no Dune. And probably there were very few other data providers. So, did you actually run a full node at ConsenSys and just scrape that node? Or what kind of solutions were you using?

Danning  3:07  

Yeah, so Boxer, this is an interesting topic. The team I joined was the data science team at ConsenSys. It was literally a competitor - we were trying to do, actually, whatever all the data providers nowadays are trying to do. The team was called Alethio. 

We had a great team of engineers, most of them based in Romania. My direct manager, who is based in Germany, also taught me a lot. And we were trying to do Etherscan plus analytics plus potentially The Graph and all kinds of data services. 

And we were running nodes, yeah. Initially, the team was doing something called EthStats, which was a very useful tool for a lot of the early miners to monitor the network and know what was going on.

Boxer  4:03  

What ended up happening? Why are we doing this podcast today and not an Alethio podcast?

Danning  4:11  

Yeah, I think the team had great technology. But there was a great lesson in that we expanded too much in terms of the business strategy and everything. But also, it was kind of early, and back then there was the bear market in 2018. 

So, it was really hard for a data team to generate revenue back then, when Uniswap had only just started. There's a pain point that I see: if you're a data team, your customers are potentially the dapp developers. 

And if the dapp team is too small, then they don't have the funds to purchase external data services yet. If they are mature enough, they probably have a solid handful of engineers to run the nodes themselves. 

So, that was a really…I always say…a dilemma for data service providers back then. But nowadays, it's totally the booming time for data, I think.

Boxer  5:11  

Yeah, yeah. No, there's so many different businesses that it's very self-evident that this is a service that people could use. 

Okay, it's interesting to get a glimpse into that history, because I wasn't around back then. I was aping into shitcoins. I didn't care about data.

Danning  5:31  

I wish I had the shitcoins.

Boxer  5:38  

So, yeah I think we haven't covered…what are you working on today? 

Danning  5:48  

Yes. So, my career focus had a bit of a shift in terms of what exact data I try to analyze or decode - back then at ConsenSys and nowadays at 0x. Back then, because it was also a very general, industry-wide role, I would say the data I was analyzing was more industry-level research. 

So, back then, it was comparing how decentralized things are, comparing Ethereum versus other chains, and all kinds of topics. And there was one topic that really caught my attention: DEXs. 

There was one article that started my analysis journey on DEX data. We looked into a few typical DEX protocols back then - 0x was still using an order book. And then I did some visualization of how those DEX users and DEXs are connected and potentially overlap. 

And then I realized that the dapp teams and the application teams in this space are pretty much the frontier, the engine actually pushing forward what's going on and what the direction is for the next few months and next few years. 

And then that was like, I want to join a DEX team. And oh, there was an interesting thing also. Back then, with the data team, we developed a small MVP called Where Should I Trade. 

There was a hackathon in Germany, so it was called, like, Where Should I Trade. Basically, people come to the website and say they want to trade USDC to WETH, and then we can tell them, oh, the best price is on IDEX or EtherDelta. Or, yeah, Kyber or something. 

And nowadays, I realize that's an aggregator. Yeah, so okay. So, now at 0x: when I joined, there was one data scientist, Kroger. He had built up a bit of the data infra and pipeline stuff, and when I joined, he shifted to an engineering role. 

I was more maintaining the data engineering pipeline, and also covering the reporting front internally. I was also working externally with a few other data providers - for example, BigQuery or others. Those data providers have to source the volume of 0x or Matcha. 

So, I have to work with them to help with the decoding and attribution stuff. So yeah, pretty much everything. But last year, 2021, was, I'm sure, a crazy year for every team. Our data team expanded from two people to, nowadays, like six people. So my responsibility has narrowed to more of the on-chain data part, I would say.

Boxer  8:56  

All right. I think many people will not really be familiar with what the 0x network is doing. So maybe you could give us a brief overview of what that is.

Danning  9:06  

Yeah, I'm not sure if you're actually saying "Ox" or "0x" - I'm joking. What's the correct name? Oh, 0x. Yeah, I'm joking. So yeah, the team has been in the DEX space for a long time. In 2017, I would say, the team started with an order book model. 

So basically, whoever comes to the protocol can submit an order - nowadays it's like a limit order. You're making an order, and then we put that order into a pool. 

And then once there is a match - say you're trading A to B and the other person is trading B to A, that's a match - the other side takes the order. So there's a maker and a taker of an order. 
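
[Editor's note: to make the maker/taker flow concrete, here is a minimal, hypothetical sketch of how an off-chain limit-order pool could pair a maker with a taker. The order fields and matching rule are illustrative assumptions, not 0x's actual matching logic.]

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LimitOrder:
    maker: str          # address of the order creator
    sell_token: str     # token the maker gives up
    buy_token: str      # token the maker wants
    sell_amount: float
    buy_amount: float   # implied limit price = buy_amount / sell_amount

# A hypothetical off-chain pool of open maker orders.
order_pool = [
    LimitOrder("0xmaker1", "WETH", "USDC", 1.0, 3000.0),
    LimitOrder("0xmaker2", "USDC", "WETH", 2900.0, 1.0),
]

def find_match(taker_sell: str, taker_buy: str) -> Optional[LimitOrder]:
    """Return the first open maker order on the opposite side of the taker's trade."""
    for order in order_pool:
        # Opposite side: the maker sells what the taker wants to buy, and vice versa.
        if order.sell_token == taker_buy and order.buy_token == taker_sell:
            return order
    return None

# A taker selling USDC for WETH gets matched with maker1's WETH -> USDC order;
# settlement of the matched pair would then happen on-chain.
print(find_match("USDC", "WETH"))
```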

And so that was the order book model. Then, with the rise of AMMs, including Uniswap and SushiSwap, we gradually realized that our order book could be more efficient if it integrated the liquidity from the AMMs. 

Because they dynamically adjust the price, and you can always find a backup option from a pool. And so, 0x started to do aggregation by comparing different sources of liquidity across DEXs. 

And then gradually we developed this API, the 0x API, where you can access the swap endpoint and basically say, I want to make a trade, and we will get you the best price across the different DEX liquidity sources. 

So that was, I guess, like two years ago. And then, later on, the team realized that the big money on the trading side, like market makers, can actually be even more efficient and sense the dynamics or the movement of the market, because they're actively arbitraging across different platforms. 

And they can provide even more competitive costs for the users compared to AMMs. So nowadays, the 0x API is still an aggregator comparing prices, including both our order book and the AMMs, and also our request for quote system, quoting from market makers. 

So yeah, that's the current model. The 0x API is basically the main product for 0x. We work with all kinds of dapps, including MetaMask and Coinbase Wallet - whenever they want to access DEX liquidity, they just call our API. 

So, in a sense, that's where our customers are at the moment. And then the team realized that we could build a, how to say, retail app that could better use the API. And so we built Matcha, like a year and a half ago, I would say. So Matcha is a front end. 

And the front end operates like a website: you go to matcha.xyz and you make a trade, and we get you the best price, still through the 0x API. And as the landscape changed, we went multi-chain nowadays.
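
[Editor's note: for readers who want to see what "calling the API" looks like, here is a minimal sketch of requesting a quote. The endpoint and parameter names reflect the public 0x swap API as commonly documented (https://api.0x.org/swap/v1/quote); treat them as an assumption and check the current docs before relying on them.]

```python
import requests

# Ask the aggregator for a quote: sell 1,000 USDC for WETH.
params = {
    "sellToken": "USDC",
    "buyToken": "WETH",
    "sellAmount": str(1_000 * 10**6),  # USDC has 6 decimals, so this is 1,000 USDC in base units
}
resp = requests.get("https://api.0x.org/swap/v1/quote", params=params, timeout=10)
quote = resp.json()

# The quote includes the routed price and the liquidity sources used
# (AMMs, the 0x order book, or RFQ market makers).
print(quote.get("price"), quote.get("sources"))
```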

Boxer  12:31  

Yeah. Why is it called the 0x network? What is the network aspect? Do you actually do storage on-chain? Do you submit proofs on-chain? How does that whole thing work?

Danning  12:44  

Yeah, I think the team definitely thought about putting everything on-chain, but history proved that on-chain is not efficient enough for making the trade at a good price. So, the pricing part is still off-chain. 

In terms of the price comparison, we definitely fetch the prices on-chain from all those AMM pools. Our back end is off-chain, but the settlement layer is on-chain.

Boxer  13:16  

Okay, I see. That's interesting. So, if I go to MetaMask and use the MetaMask swap feature, it will literally just call the 0x API - MetaMask calls the 0x API for me.

Danning  13:40  

Yeah, that is correct. But also, nowadays the wallets want to try their best to get the best price for users. So they are not only calling our API, they're also calling 1inch. So they're literally aggregators themselves.

Boxer  13:58  

Well, it's interesting. And the same probably applies to what Zerion and Zapper are doing.

Danning  14:06  

I'm not sure about those specific teams, but Coinbase Wallet is likely doing it as well, and some others for sure. So yeah. 

Boxer  14:19  

Okay. Yeah, that's very insightful. I never really thought about it that way. So, you are basically taking care of the on-chain data - like most of the things that happen on-chain for the 0x network. Is that kind of a correct statement?

Danning  14:40  

Yeah, I could say that. So, we have an internal data pipeline that is fetching the on-chain data. Externally, we are also sometimes using some other data providers as well, for different reasons. It's always a lot of overhead for a dapp team to, how to say, pull in a full copy of the blockchain data. It's giant, and it's a lot of maintenance. 

You guys know the size of the data lake. So, the internal data pipeline is only fetching the things that are relevant to 0x, like the transactions that go through the 0x exchange proxy. 

So, that is helpful for internal metrics and stuff - the reporting of volume and reporting of user numbers. But Dune is super helpful because of its coverage of the whole blockchain, as well as so many other teams already doing the attribution there.  

If we want to do a market share comparison, or a user comparison, or if we want to check how much our users overlap with others, we'll do that there. 

Boxer  15:52  

Yeah, yeah. Maybe we could actually jump into some of the dashboards that you made on the DEX aggregator market - I think I saw a dashboard there - and just get a few thoughts about how you actually work with these results.

Danning  16:13  

Yeah. Hopefully I can share the screen. So, here's one dashboard on my personal account that is called DEX Aggregators. There's always been some discussion about how the data should be attributed. 

But let's dive into the general idea here. So, as we know, a DEX aggregator is a specific, how to say, widget on top of different liquidity sources. We are sourcing liquidity from all those DEXs. And the teams that are doing similar things - the biggest ones are, like, 1inch and ParaSwap. 

And so, this is Ethereum only. What we see over time... this is more of a, how to say, history of all the DEX aggregators. 1inch was super early; they started in like 2019. 

And then, after almost a year, 0x started providing the API service. And later on, ParaSwap started. So nowadays, I would say 1inch is still the dominant one, with over 50% of the market share in terms of volume, and 0x is second with around 30% of the market share. 

Yeah, so this is how the current landscape looks. There are a lot of…how to say…nuances here about the data. So, one thing I do add as a filter here in this dashboard is that I am excluding the wrapping volume here. 

As I said, I'm excluding, first of all, the ETH-to-WETH trades. For generic purposes you could just include all of them, but specifically, wrapping volume means wrapping ETH to WETH or unwrapping WETH back to ETH. 

So, in our team we were discussing it - turns out wrapping volume can be a large percentage on some other aggregators. We felt this doesn't really make sense for, how to say, demonstrating the user's needs, because it's almost always just accompanying a trade. 

If the user wants to, how to say, swap ETH to another token, they may have to get onboarded through a wrap. So yeah, that's one thing. There's another thing that is also controversial: there's some discussion about whether aggregators should exclude the hops. What is a hop? 

Nowadays, aggregators have a lot of complicated logic; we do a lot of complicated things. For example, multiple fills is one thing, which is pretty straightforward. It's when you want to make a big trade - let's say you want to trade 1 million USDC to WETH. We might find maybe 500k of liquidity from Bancor and 500k of liquidity from Uniswap and make two fills for you. 

So, when it comes to fills, first of all, some people get confused and count that as two trades. So that's multiple fills; that's one thing. The second thing is the hop thing. If you are trading a super illiquid pair - let's say some token A to, I don't know, some super unknown DeFi token B. 

And then we find out, oh, the price is super bad, or there's no pool at all to trade between this pair. We will find you something like A to WETH and then WETH to B. So there's a bridge, a hop, in between. 

And so when different aggregators settle this trade on-chain, they consider it differently in execution. What we do is we tend to not log a specific event to save some gas for the user. 

But then in this case, when I decode the data, I will have to maybe go down to the Uniswap event and grab that fill, and then go to the other liquidity source to grab that fill. 

And then, in this case, I will actually be doubling the volume. So, that's one issue that still exists in the aggregator space right now. So yeah, that's the multi-hop thing that people are talking about.
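
[Editor's note: a toy sketch of the two attribution choices discussed above - dropping wrap/unwrap volume and not double counting multi-hop routes - over an imaginary list of decoded fills. Field names and numbers are made up for illustration, not a real Dune table schema.]

```python
fills = [
    {"tx": "0xaaa", "sell": "ETH",  "buy": "WETH", "usd": 1_000},    # a wrap, not a real trade
    {"tx": "0xbbb", "sell": "USDC", "buy": "WETH", "usd": 500_000},  # split fill 1 of a 1M trade
    {"tx": "0xbbb", "sell": "USDC", "buy": "WETH", "usd": 500_000},  # split fill 2 of the same trade
    {"tx": "0xccc", "sell": "A",    "buy": "WETH", "usd": 10_000},   # hop 1 of an A -> B trade
    {"tx": "0xccc", "sell": "WETH", "buy": "B",    "usd": 10_000},   # hop 2 of the same trade
]
# What the user actually asked for in each transaction.
requested = {"0xbbb": ("USDC", "WETH"), "0xccc": ("A", "B")}

WRAPS = {("ETH", "WETH"), ("WETH", "ETH")}
non_wrap = [f for f in fills if (f["sell"], f["buy"]) not in WRAPS]

# Execution-capacity view: every routed fill counts, so both hops of 0xccc are
# included and the 10k trade shows up as 20k.
execution_volume = sum(f["usd"] for f in non_wrap)

# User-demand view: only count the legs that deliver the token the user asked
# for, so hop 1 of 0xccc is dropped but both parallel split fills of 0xbbb stay.
demand_volume = sum(
    f["usd"] for f in non_wrap if f["buy"] == requested[f["tx"]][1]
)

print(execution_volume, demand_volume)  # 1020000 vs 1010000
```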

Boxer  21:04  

Yeah, so you are indeed double counting the volume?

Danning  21:08  

So, we are making sure we're not duplicating in most cases. But there are still some cases where we're duplicating it. I think it also depends on how you're interpreting the data. 

So, if we're counting multi-hops, I think the volume does still represent the protocol's execution capacity on-chain - the smart contract's routing capacity. But if you're counting that volume and saying that's the actual user-requested demand, that's not true.

If you are trying to measure user demand in terms of volume, then you wouldn't want to count it that way. So, I think it also depends on how you interpret the data.

Boxer  21:54  

Yeah, yeah - as long as there's a consistent method, if you're using the same method for all the different aggregators, I guess you can reasonably make these comparisons. 

But yeah, that just shows - when I started working with blockchain data, I was like, hey, this should all be really clean. It just gets produced by computers. How cool is this? There are no input issues and stuff. 

And then you actually get down into the weeds and you're like, holy shit, there are a lot of edge cases. 

Danning  22:27  

Yeah, especially with what we're doing right now in the DEX abstraction repo. We want to standardize the thing, but each team is executing in a different and maybe innovative way, and then it's harder to standardize. Yeah.

Boxer  22:44  

Yeah, interesting. And also, it's super interesting from our side. This is such a sought-out, sought-after... what is this word? 

The DEX aggregators, dude - because it enables you guys to not only look at your own data, because I guess 1inch has their own internal data pipelines as well. 

But then, if you want to look at everything - like Bancor and Uniswap, Uniswap v2 and v3, SushiSwap, and then all of these super fringe DEXs as well - it's not feasible to do this on your own. 

So, our product comes in very handy there. So yeah, it's super interesting to see the dex.trades abstraction evolve with help from all of these teams.

Danning  23:37  

Yeah, yeah. One thing that's really interesting is if you think about that repo that everyone's contributing to - it's like a DAO with an incentive mechanism. 

It kind of has an incentive mechanism because everyone wants to be well-represented, but then everyone's doing governance on other people's data - like, hey, nagging you in the PRs to fix this data. 

Boxer  24:03  

It's a DAO, yeah. It's a jointly managed repository, which is basically... yeah, there are some wars being fought in there. It's pretty funny to read sometimes. So, can you open the DEX metrics dashboard by Fredrik? 

In there, to put this into perspective for those people who are unfamiliar with the DEX space, the aggregator share of DEX volume is actually 27%. 

So, it's actually a giant market. In the last, I guess, 7 days? 30 days? What is this? How much volume do you guys make?

Danning  24:49  

Oh, here we go. It's like a few billion.

Boxer  24:55  

Casually. Yeah.

Danning  24:57  

Yeah. Yeah, 7 days, a few billion. It was totally different a year ago. I feel the past year it's been booming in terms of volume and number of users as well. 

One thing, though, that's kind of ironic: whenever the market crashes, it's the peak time for our volume. So, it always picks up in a bloody market.

Boxer  25:25  

Yeah. Yeah, people need to get out of their positions, I guess. Yeah. Yeah, yeah. The chart on the left…could you comment on the colorfulness?

Danning  25:38  

Yeah. It's awesome. I'm sure the number of colors has been growing a lot over time. So yeah, you want me to comment on how many different DEXs there are?

Boxer  25:52  

No. Like, what's your opinion on this? Is this actually real innovation? Or is it just forks of forks, like, trying to grift?

Danning  26:04  

Oh, yeah. I mean, nowadays it shows like 20-ish or even more. Some smaller teams aren't in there yet - they haven't figured out how to get added to the repo yet. 

So yeah, I see Uniswap dominating. I think Sushi's volume has been going down; recently Sushi has always been like number two or three in terms of DEX AMMs. 

Curve has been getting a lot of attention, mainly through the Curve wars and stuff. When we look into Curve data, they definitely have an advantage over the x*y=k design when it's a super, how to say, fluctuating market, based on their curve design. 

So yeah, those are the big names for DEX AMMs. And then I think it's always 1inch or 0x following as the biggest aggregators. Oh, but I think this chart is only DEXs, though.

Boxer  27:10  

Yeah. Yeah, it's not. Because then we would have a bunch of duplicated volume if we included the aggregators.

Danning  27:19  

Yeah, that makes sense. I think one thing that is interesting with aggregators is that the data can also showcase the competitiveness of the DEXs: when we are comparing the price, basically we're comparing which DEX has better pricing in this case. 

So, if Uniswap's liquidity is sourced more, that means Uniswap has the better liquidity. So, it's a fair pricing comparison across all the DEXs and AMMs.

Boxer  27:49  

Yeah. Do you have data on where you're routing to? I think there's some interesting data somewhere. So maybe we can look at Anton's dashboard, if you don't mind - k06a. Yeah. Are you logging this data internally? Probably, right?

Danning  28:18  

k06a? Oh, yeah. We're logging it internally.

Boxer  28:27  

I think he's doing it. Yeah, 1inch exported volumes.

Danning  28:36  

Right. All right. So this is…

Boxer  28:38  

Uniswap v3? Pretty much.

Danning  28:41  

Yeah, I think so. It's based on their pool design, specifically the 1bps pool creation, which has been, I would say, enabling a lot of potential there. One other interesting angle, though, that some people may use for analysis: when Uniswap's liquidity is sourced more, it's likely because the LPs are not actively moving their liquidity according to the market movement. 

So, it's possible the higher capital efficiency comes at the cost of the LPs' impermanent loss, because the price wasn't, how to say, adjusted toward what the market is showing as the mid-market price, and then basically the liquidity provider is giving out a very bad price for themselves. Things like that. And then there are so many arbitrage bots eating their liquidity.

Boxer  29:43  

I mean, for the LPs, it should theoretically be a good thing if there are arbitrage bots, right? If the bots just trade into Uniswap pools all the time, they get the fees from that.

Danning  29:56  

Yeah, they get the fee from them. But let's say they provide liquidity for ETH/USDC at a price of, I don't know, $2,000 per ETH. But then the market moves crazily to $3,000. They could have withdrawn the liquidity and maybe sold the ETH to someone else at a better price. So, that's like impermanent loss, I guess.
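
[Editor's note: here is the example above worked through with the standard constant-product (x*y=k) impermanent-loss formula. The numbers are illustrative; this is the full-range, v2-style figure, and concentrated Uniswap v3 positions amplify the effect.]

```python
import math

p0 = 2_000.0  # ETH price when the liquidity was deposited (USDC per ETH)
p1 = 3_000.0  # ETH price after the move
r = p1 / p0   # price ratio

# Value of the LP position relative to simply holding the two assets,
# ignoring trading fees earned along the way.
lp_vs_hold = 2 * math.sqrt(r) / (1 + r)
impermanent_loss = lp_vs_hold - 1

print(f"LP value vs holding: {lp_vs_hold:.4f}")       # ~0.9798
print(f"impermanent loss:    {impermanent_loss:.2%}") # ~-2.02%
```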

Boxer  30:27  

Yeah, yeah, Uni v3. It's more of a platform. If you go in there personally and try to make some trades... I know some people who are just LPing in small-cap shitcoins. That's usually working out pretty well. 

But if you go into ETH/USDC or something like that, you'll get wrecked if market making is not your job. But the innovation that this has brought forth is pretty incredible. If you look at how little liquidity and how much volume share Uniswap v3 is taking, it's absolutely amazing.

Danning  31:17  

Yeah, actually, we did some research on it. I totally agree. The 1bps pool or 5bps pool is much more active than the larger-fee pools, and the capital efficiency could be something like five digits percent - it was boosted to some crazy number. 

Yeah, but as you said, the USDC/WETH pool is definitely not for retail - it's super active, and also the biggest market makers are in there. I mean, their names and addresses are labeled. 

So, market makers like Wintermute and the like are actively market making. They are market making with aggregators - providing quotes to aggregators - as well as acting as arbitrage bots on the AMMs. They're trying to get both sides.

Boxer  32:11  

Yeah. Yeah, I need to get one of those people on here. I'd need to read like five books on market making beforehand, I guess, to even be able to ask questions. So yeah, that's a pretty interesting dashboard. 

I'm actually surprised. Why is it that the 1bps pools... on the chart on the top right - no, yeah, it's the one above that - why is there... where are the 1bps pools?

Danning  32:47  

The 1bps... it seems like the liquidity range is pretty far off from the mid-market. I think it's possible that it's being consumed too quickly. Yeah, it needs more time to be covered.

Boxer  33:05  

Yeah. So yeah, to get back to this, the DEX aggregators are basically a good indicator of which mechanism is working best?

Danning  33:27  

I think so. And it will be specific to the pairs and pools they're providing - MIM, for example, was definitely showing up only on Curve and stuff. 

And also, there's another condition: only if the aggregator is comparing the price fairly, right? Without any bias towards, or prioritization of, any source. 

Boxer  33:55  

Has there been a debate around that?

Danning  33:59  

I think there has been a debate around that. But I would imagine if any aggregator came up with their own native liquidity, or, I don't know, their own users' limit order pool, it's possible. It could be a design choice, but it's not the case.

Boxer  34:18  

Yeah, I see. Yeah, that would be kind of…yeah, you would need to communicate that clearly. So not something I've thought about before.

Danning  34:30  

Yeah. In that case, good or bad, you can also double-check it on Dune.

Boxer  34:35  

Yeah, but the problem, especially with MetaMask swaps, is they take a swap fee. So, wherever you end up trading, you get a really bad quote anyway, because MetaMask takes something like 0.5% off you. Something like that. Or even higher? I don't know.

Danning  34:55  

Yeah. Yeah, like 875. Yeah.

Boxer  34:59  

0.875

Danning  35:01  

Yeah. Is there a point? 0.875? 

Boxer  35:05  

Yeah, that's almost a percent. That's really bad. Yeah, I think we've covered the DEX aggregator market. So maybe, what's your favorite dashboard on Dune besides what you are personally working on at the moment?

Danning  35:24  

I see. My favorite one... DEX Metrics is the one that I go to the most; I like that one a lot. If I may take a look... you mean other than the ones I work on?

Boxer  35:45  

You're not fan-girling over someone, like…

Danning  35:51  

Honestly, I'm a fan of Anton, because he's super quick updating all the data. Yeah, I try to talk to him sometimes. I was like, hey, this volume might need an adjustment because there's also a lot of overlap between aggregators. 

Because 1inch also used to use our on-chain smart contracts to settle their trades. And the query is using the 0x trades table that I built, and then we shifted the table with versions and stuff. 

And I was like, you will need to change to that table. But when I sent the message - if you're not his friend, you cannot talk to him. Yeah, a year later or something, he accepted my friend request. 

Yeah, I'm a failure. Recently, I've been seeing a lot more crazy wizard stuff from the NFT space. Oh, and I really liked the EIP-1559 dashboard from Michael - Silberling. He's creating and generating all those, like, parameters and features.

Boxer  37:07  

Michael is basically programming Dune. Okay, sure - you don't give me an API? Let me build this myself. It's pretty amazing to see. I think he comes up in every episode because I'm his number one fan. 

Yeah, the innovation that he's bringing to our platform is madness. He's just putting us on a whole other level. And now, what he's doing... have you looked into the Optimism abstraction? 

The repo, what he's doing in there? What we've done as a community for Ethereum, he's just doing on his own for Optimism. So that's like, what?

Danning  37:57  

Yes, yes, thanks to him. So, the other day I was like, oh, Michael is committing every day so actively, and I looked into it. Basically, he was building out all the infra tables on Optimism and pulling in metadata and token prices. And then I was like, oh, with this I could query the 0x volume on Optimism fairly quickly. It was stable with no issues, so it's great. It's awesome.

Boxer  38:26  

Yeah, he's a... I've learned that Grand Wizard is a really bad word in the US, so he's not a Grand Wizard. But he's a great wizard. Yeah, how has the multi-chain move been going for 0x?

Danning  38:47  

Yeah, let me... I've got that dashboard. So yeah, 0x has been trying to expand to multi-chain since last year, March 2021. The first move was Binance Smart Chain. And there have been so many challenges as we move to multi-chain, because a lot of the chains are much faster in terms of block production. 

So, for example, Binance Smart Chain and Polygon both have two- or three-second block production. It's challenging for the data - and I'm sure for you guys too - how often the cron job checks the chain, as well as the reorg issue. As far as I know, I think Polygon is the one that could be the biggest headache. 

I was joking the other day, when Polygon raised at a crazy valuation, that it should be distributed partially to all the teams that are building on Polygon, for their data costs. Yeah, so you can imagine - I think they have something like a three-hour finality time, which translates into over three thousand blocks. 

And so, it's impossible to do the reorg check in the regular way. The regular way we do it is probably not that smart: we just make sure that we scrape the tip of the chain.

And then, next time, we remove the last 30 blocks, or the last blocks that could potentially be reorged - but that's not feasible to do with Polygon. So, you always have a sanity check on the block hash, the parent hash, and stuff. 
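
[Editor's note: a minimal sketch of the scrape-the-tip-then-sanity-check approach described above. `get_block` stands in for whatever RPC call the indexer uses (e.g. the standard eth_getBlockByNumber method), and `stored_blocks` for blocks already written to the warehouse; both are assumptions for illustration.]

```python
from typing import Optional

def get_block(number: int) -> dict:
    """Fetch {'number', 'hash', 'parent_hash'} from the node (stubbed here)."""
    raise NotImplementedError

def parents_link_up(batch: list) -> bool:
    """Continuity check on a freshly scraped batch: each block must reference
    the previous block's hash as its parent."""
    return all(b["parent_hash"] == prev["hash"] for prev, b in zip(batch, batch[1:]))

def find_rescrape_start(stored_blocks: list) -> Optional[int]:
    """Walk back from the stored tip until the node agrees with what we stored.

    stored_blocks is ordered oldest -> newest. Returns the lowest block number
    that was reorged out and needs re-scraping, or None if nothing diverged.
    """
    diverged_from = None
    for stored in reversed(stored_blocks):
        onchain = get_block(stored["number"])
        if onchain["hash"] == stored["hash"]:
            break                            # everything at or below this height still matches
        diverged_from = stored["number"]     # this height was reorged out
    return diverged_from
```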

It's a challenge, and our infra is also improving, I guess. So yeah, that's one aspect about the chains. Whenever the 0x API expands to a new chain, the logic is that we need to deploy the smart contracts on the new chain first. 

And then we do the API side, to enable it to actually access the different chain. And then, on the Matcha side, we enable the front end, so people can actually make trades on the different chains. 

But I would say, for a lot of chains, when we try to decide which chain to go to, we do research in terms of how many users there are, how many transactions are happening every day already, etc.

But it also depends on how many liquidity sources there are. So, if SushiSwap and Uniswap are not going there, or the large liquidity pools are not going there, then there is no point for an aggregator. 

There's no price to aggregate there - it's going to be a bad price for users anyway. So, we have to go after the DEXs and AMMs, usually, I would say, logically. And now, talking about Solana, it's a totally different kind of chain. 

So there's even more - so much more - to learn. And a few of the data things I learned have a totally different context in terms of the metrics we talk about. Like, first of all, when you talk about the number of transactions, a lot of the data presenting the number of transactions is including the validator votes. 

Like 80% of the transactions in some data on Solana Beach or somewhere - it shows that those are mostly the PoS validators voting to confirm the blocks. So, when you talk about transactions, you want to be aware of what scale you're actually talking about. 

And then there's another thing: the Solana token program. It's designed so that when one wallet has a token holding, some kind of address is generated for it. And if you have another token in the same wallet, there's another address and account data associated with that. 

So, when you talk about the number of addresses, it's not the same concept as the number of wallets or users. Even the number of users isn't the same as the number of wallets, I would say.
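
[Editor's note: a toy illustration of the two Solana counting pitfalls mentioned above, over made-up records. The field names and placeholder program ID are assumptions for illustration, not a real dataset schema; substitute the actual Solana vote program address in practice.]

```python
VOTE_PROGRAM = "VoteProgramPlaceholder"  # stand-in for the real Solana vote program ID

transactions = [
    {"sig": "s1", "program": VOTE_PROGRAM},    # validator vote
    {"sig": "s2", "program": VOTE_PROGRAM},    # validator vote
    {"sig": "s3", "program": "TokenProgram"},  # an actual user token transfer
]

# Pitfall 1: headline transaction counts include validator vote transactions.
user_txs = [t for t in transactions if t["program"] != VOTE_PROGRAM]
print(len(transactions), "raw txs vs", len(user_txs), "non-vote txs")

# Pitfall 2: each (wallet, token) pair gets its own token account, so counting
# token-account addresses overstates the number of wallets or users.
token_accounts = [
    {"address": "acc1", "owner": "walletA", "mint": "USDC"},
    {"address": "acc2", "owner": "walletA", "mint": "wSOL"},
    {"address": "acc3", "owner": "walletB", "mint": "USDC"},
]
wallets = {a["owner"] for a in token_accounts}
print(len(token_accounts), "token accounts vs", len(wallets), "owner wallets")
```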

Boxer  43:14  

Yeah, the concept of Solana is just radically different from what Ethereum is doing with storing a global state. In Solana, they have these accounts - I think it's called the account system. 

And when I first looked into Solana data, I was like, oh, what is this? Yeah, we, the data analyst community, will all have to learn this. I guess some of us will just be like, yeah, I don't care, I will stay EVM-focused. 

Because frankly, I think the EVM space is already getting so big that, for most of us, it's probably the better move to just be like, hey, I'm gonna specialize in this. 

And then there will be other people popping up who can be the Solana specialists. I kind of need to learn it because of this job; if it weren't for that, I would probably have stayed an EVM specialist. 

Just because the EVM is already very complicated, and trying to learn all of the Solana stuff on top seems really hard. But maybe Solana data will actually be easier, because what kind of problem are we facing on the Ethereum blockchain? Oftentimes, it's that we can't access the storage state on Dune, right? 

And I think, in general, that's a very hard problem to solve. I think there's one team in the space trying to do this now, and I'm really excited to see it when it goes live. 

But yeah, it's a real problem, and maybe with the account system that Solana is doing - because the global state is basically in these accounts, right? - maybe it will be easier for us there, but we'll have to see.

Danning  45:01  

Yeah, that makes total sense. I mean, as an analyst, though, if I wanted to get famous with a Solana dashboard... that wouldn't stop people from analyzing EVM, too. 

So, just like all those big VCs, we won't make it with Ethereum, so we're gonna use Solana to make it. Yeah.

Boxer  45:24  

Yeah, a new generation of Dune Wizards will be Solana Wizards. I mean, that could happen. They could. And yeah, we'll have to see how fast we can develop certain query meta and how easy it will be to fork different queries of different people. 

But I think most of the stuff that's happening on Solana is this one DEX anyway, which is called Raydium - is that how it's pronounced? I don't know. There's one DEX which just has a shit ton of different front ends, but the back end is always the same. 

So, I think 80% of the transactions are basically this one DEX, which just has these different front ends. So, I don't know. It doesn't sound too complicated if you think about it at first.

Danning  46:11  

I see, I see. So it sounds like the Uniswap of Solana. I think of names like Serum and Orca.

Boxer  46:18  

Yeah, it could also be Serum, yeah. See, I don't know much about Solana yet. But the question that originally came to mind is: so the 0x network has this API already, right? 

So, are you looking at doing cross-chain stuff eventually? Where somebody submits an order for 500 or 1,000 ETH, and then you're like, hey, we can actually make this trade cheaper on Polygon and bridge back to Ethereum? Is that something you guys are thinking about?

Danning  46:57  

Yes, totally - cross-chain. This has been a big research topic here as well. There could be different ways to implement it. In the scenario you just mentioned, it's possible the user may just want the token back on the Polygon network, or maybe they still want the token back on Ethereum. 

But then we found a better swap hop through another network. It could get quite crazy and complicated, but initially we had some thinking of, "What if we don't do any extra infra hassle, but just let the market makers market make across different chains?" 

So, they would do their rebalancing across chains, cross the bridge themselves every day or so. That would be like an automatic, economically incentivized way to get the cross-chain functionality. 

That's one thought at the moment. Another thought is to basically use protocols like a cross-chain bridge, Hop Protocol, or other infra. I think we're evaluating multiple options at the moment to figure out what will be the best one. 

And then, whenever the user wants to trade on a different chain, we basically do all the things - let's just say bridge the asset and then do the swap there, something like that. But then the recent thing with Wormhole makes it really hard to say. 

I wouldn't say hesitating, but we want to be more cautious about the steps we take. I think 0x has always been... honestly, we wouldn't want to chase the trend just to be the first one to do this here or there. 

I think we're more cautious and responsible towards all the partners we're working with. So, we may just observe the landscape, see what's developing, and then decide.

Boxer  48:57  

Yeah, the fast-follower strategy, basically. But there's so much complexity involved already. And then if you factor in reorgs - if those happen and you've done a multi-chain swap, then you're basically like, what are we gonna do? So, it's very complicated.

Danning  49:22  

Yeah, with the finality times, theoretically you'd have to wait for the longest maximum finality time. Yeah, so…

Boxer  49:34  

Yeah, need some good insurance.

Danning  49:43  

Or are we just letting Jump Capital invest in everyone? Yeah.

Boxer  49:48  

Yeah, yeah, reasonable. I don't know if you've seen this, but you've probably heard the story of this couple who stole like 3.6 billion in Bitcoin. And there was a diagram of how those FBI agents tracked the different wallets. Have you seen that?

Danning  50:11  

I haven't. But I would imagine it's some kind of mixing detection - like tracing whoever traded with this address - looking like a Sankey diagram. 

Boxer  50:22  

Oh yeah, it was basically a flow diagram. And Monero was somewhere in there. And I was actually really surprised - not that you'd happen to have insights into how to decode Monero. 

But just like Bitcoin and Monero tracing in general, is that something you're kind of familiar with? Probably not. Right.

Danning  50:44  

Not much. But yeah, back then, that de-anonymization project was basically trying to do something like this. I'm not sure if it's exactly the same, though. I'm assuming they had to transact between their wallets, but they would do some mixing pattern - they just try transacting with some random irrelevant wallets as well. 

But if you get the connected components in a network graph, then you can basically filter out the ones that are irrelevant. I don't know if that's the idea here.
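
[Editor's note: a minimal sketch of the connected-components idea, using networkx over made-up transfers. An edge means "these two addresses transacted"; clusters that never touch the seed addresses can then be discarded as irrelevant.]

```python
import networkx as nx

# Hypothetical transfer edges between addresses.
transfers = [
    ("walletA", "walletB"),
    ("walletB", "mixer1"),
    ("mixer1", "walletC"),
    ("unrelated1", "unrelated2"),
]

G = nx.Graph()
G.add_edges_from(transfers)

seed = "walletA"  # an address we already know belongs to the target
for component in nx.connected_components(G):
    if seed in component:
        print("cluster linked to the seed:", component)
    else:
        print("irrelevant cluster:", component)
```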

Boxer  51:16  

Yeah. Yeah, maybe you would like... I thought maybe you'd seen the flowchart. I'll send it to you after the podcast. I was just very surprised that they were able to - they went through a few different mixers, they went through Monero and through 20 different wallets. 

And they were still able to be tracked. So, I just thought that was interesting - the state agencies seem to be getting pretty good. 

So, everyone out there who feels safe behind Tornado Cash, you should think twice. Yeah, yeah, I think this has been a very insightful conversation. Maybe if you could wish upon a star... 

What feature would you like to see in Dune next? We've announced a few features during our Series B announcement, but other than that, what would be a feature that you would really like to see?

Danning  52:25  

Yeah, I think Dune has been super helpful on the education part, like the OurNetwork SQL course. We literally found our recent data scientist through that course - he joined us this week. 

He was super active in that course, I remember. So, we're lucky. All you're doing is pushing the space forward. Yeah, he's joining us. So yeah, that's great. 

Yeah. I think Dune has all the stuff that's needed for a newbie to enter the space with data expert as the career goal. So, maybe you could do something like a knowledge graph, or some kind of, how to say... I would imagine you can go to Dune and then find out, like... 

Oh, these are all the best dashboards for NFTs. And these are all about DEXs, and these are the things you need to know about DAOs and stuff. Curating more of that kind of onboarding for newcomers - that would be awesome. That would be exciting. And you guys can also develop it, since there's, like, a curation platform already.

Boxer  53:46  

It's actually amazing how many people have taken that course and now have jobs or are very successful in the space. We hired one as well - or two, if you count that we've given out a freelancer grant this morning. 

And we've hired someone else. So yeah, I think those people are really going places. It's really amazing to see. And yeah, I fully agree with your take on knowledge curation. I think what we're doing could become more of a Wikipedia-style thing - like what's already kind of happening in the dex.trades repository, where different people try to find the truth. 

And that's something that I've very actively been thinking about for the last few weeks. And we'll definitely start trying to tackle that. And then there's also…I think…it needs to be a different kind of UI thing where we just curate certain dashboards. So yeah, I'll probably hit you up about that. 

Danning  55:06  

Sounds good - sometime offline. And also education: we're all into this, and it could be my mission. I could be a professor.

I mean, the Dune Discord is also like Stack Overflow for blockchain data - there's always a new open tab for the forum. I imagine people would even... I wouldn't say make a living, but try to make it as, I don't know, a thing by answering questions and becoming a wizard there.

Boxer  55:29  

Yeah. Yeah, maybe we should just start a Stack Overflow category, I guess, because that's the solution for most code stuff. That could actually make sense. I should actually check if there's been a question there that has been unanswered for like six months. 

I don't know - there could be, because nobody ever checks. But surfacing all the knowledge that we are exchanging all the time - I think that's a big theme that we, as an organization, recognize we're not doing a good enough job of. 

So yeah, great, interesting feedback. Thanks so much for coming on. I'm a big user of Matcha myself. Although, recently, I really like CowSwap - I hear that they're using 0x in the background as well. 

Yeah, thanks for all the contributions to dex.trades. Keep rocking. It's been a pleasure to have you on. Have a nice day.

Danning  56:36  

Okay, you too. Thank you all for hanging out here, and have a nice day. I'm sure we'll all cross paths in this space. 

Boxer  56:46  

Yeah, we'll see each other around. Yes. Take care. Bye bye.