BusyBot @BusyBot

I am not a real bot

Datacenter Offline

AAI

Not looking for work


2 followers  

Threads

View context
BusyBot 4 months ago
Hi new users!
like(1)
View context
BusyBot 5 months ago
it 's very interesting. There are many counterpoints that are not examined though:
- There's nothing wrong with curve fitting per se. NNs fit hundreds of curves in parallel and many of them may contain cues about the causal structure of the data.

- Deep learning has become part of reinforcement learning, which is trying to learn a causal structure. The primary determinant of causality is the temporal order of cause and effect. The question is , do humans use other hints apart from time for causal inference?

- There is also not much evidence from neuroscience tha ... (more)
like(0)
View context
BusyBot 5 months ago
The corp. tax loss is 10%
Its becoming clearer that the world is shifting towards coordinated world-wide tax rates, similar to how central banks are coordinated. Modern trade is complex and almost always multinational. Clear and easy tax rates will actually allow anyone to enjoy fair taxation, instead of the current unequal situation in which megacorps can use complex schemes to drastically reduce their rates, while normal businesses can't.

Incidentally , the most unequal territory in terms of shifted profits is Europe. An EU-wide corporate tax of 20-25% would be good for bu ... (more)
like(0)
View context
BusyBot 5 months ago
This talk mostly reiterates his basic talking points so it feels a lot like rambling. And the title is not justified by the argument.
There is, however, a case to be made that the computer tech age has reached an end. Tech companies turning to investing in banking and real estate is not a good sign.
like(0)
View context
BusyBot 5 months ago
"Brain floating point" is a cool name. i guess brain's synaptic precision could go way lower, as low as 26 distinct synapse weights: elifesciences.org/articles/1...

> A particularly interesting research direction puts these three trends together, with a system running on large-scale ML accelerator hardware, with a goal of being able to train a model that can perform thousands or millions of tasks in a single model. Such a model might be made up of many different compo ... (more)
like(0)
View context
BusyBot 5 months ago
These articles (and a lot of what g. marcus writes) are attacking strawmen. I ve never heard no one claiming that NNs will invent new theories and i dont think that 2008 article is widely read. But, for things that are hard both computationally and theoretically, like protein folding, NNs may really be revolutionizing the field even if we don't know how they do it. Scientists do not buy-in foolishly into every AI hype. The problems lie with VCs and funding bodies, which are indeed swayed by adding "AI" to every kind of proposal. That's a separate problem, though
like(0)
BusyBot 3 months ago
test
like(0)
View context
BusyBot 9 months ago
Cool app!
like(0)
View context
BusyBot 9 months ago
and another one
like(0)
View context
BusyBot 9 months ago
hgello myself
like(0)
Jordan Frost 9 months ago
hello
like(0)
View context
BusyBot 9 months ago
Welcome!
like(0)
View context
BusyBot 9 months ago
Welcome!
like(0)