belabacsijolvan

as standards for calling something "ai" fall, an sql query will count as "ai" soon. i read a paper 2-3 years ago where a method that mathematically simplified to linear regression with sme was called ai. so we cant be far


DonutConfident7733

WHERE ... LIKE ... wow, check out this AI search...


geekusprimus

>i read a paper 2-3 years ago where a method that mathematically simplified to linear regression with sme was called ai. so we cant be far

Technically speaking, that *is* what most people are calling AI these days. If you take a neural network and use a linear activation function, all the layers collapse into a single matrix. If you use the usual mean squared error as your loss function, it becomes a least-squares linear regression. The process of backpropagation and gradient descent then reduces to an iterative method for solving a linear system; you could achieve the same result by explicitly calculating the inverse for the least-squares system or by row reduction.

All that AI does is take a dataset and find an optimal functional fit based on the desired properties. In other words, the difference between modern AI/deep-learning methods based on neural networks and a nonlinear statistical regression is marketing.
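The layer-collapse claim is easy to check numerically; here's a minimal numpy sketch (biases omitted, layer sizes made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "hidden layers" with identity (i.e. linear) activation
X = rng.normal(size=(8, 4))   # batch of 8 inputs, 4 features
W1 = rng.normal(size=(4, 5))  # layer 1 weights
W2 = rng.normal(size=(5, 3))  # layer 2 weights

deep_output = (X @ W1) @ W2  # forward pass through "two layers"
collapsed = X @ (W1 @ W2)    # exactly the same map as one matrix

assert np.allclose(deep_output, collapsed)
```

So with linear activations, depth buys you nothing: the composition of linear maps is a single linear map.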


RamblingSimian

Relevant XKCD: https://www.explainxkcd.com/wiki/index.php/1838:_Machine_Learning "Just stir the pile until they start looking right"


Material-Mess-9886

Exactly. Same for logistic regression: it's now AI (it's a single-layer perceptron), it was previously machine learning, and it all started as a generalized linear model.
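The logistic-regression-as-single-layer-perceptron equivalence can be checked directly with scikit-learn; a toy sketch (dataset and sizes are arbitrary):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)

# A single-layer perceptron is just sigmoid(X @ w + b) with learned w, b
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

manual = sigmoid(X @ clf.coef_.ravel() + clf.intercept_[0])

# Identical to the fitted model's class-1 probabilities
assert np.allclose(manual, clf.predict_proba(X)[:, 1])
```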


belabacsijolvan

not really. no one who knows what they are doing uses only linear activations. its like saying "smartphones are just nutcrackers, because if you hit a nut real hard with them they crack it all the same"


geekusprimus

Nutcrackers and smartphones aren't designed to do the same thing. Linear regression and neural networks are. The difference is that linear regression requires your functional form to be a linear combination of the basis functions, and neural networks (and nonlinear regression in general) do not.


belabacsijolvan

we agree, exactly my point.


geekusprimus

I don't think we do agree. You're arguing that it's not fair to compare neural networks and AI to linear regression. I'm saying that it is; it's doing pretty much the same thing, just with fewer restrictions.


belabacsijolvan

>Nutcrackers and smartphones aren't designed to do the same thing

this is the point. im not arguing its unfair to compare them. id say its extremely important to compare them, thats how we know that making a fully linear NN is a waste of time compared to an analytical approach. im saying linear regression is not AI. the differences in restrictions between them are not negligible.

its the difference between the economic system and a simple pendulum. they both have similar basic equations, but through nonlinearity, chaos and emergent properties one gains qualitative differences. No wonder we have words like "economics", "quantum informatics" or "condensed matter". You should know the harmonic oscillator to study any of them. Their solutions contain the math behind a simple pendulum again and again. Yet the harmonic oscillator is not economics, its not quantum informatics, its not condensed matter physics.


geekusprimus

>im saying linear regression is not AI. the differences in restrictions between them are not negligible. its the difference between the economic system and a simple pendulum. they both have similar basic equations, but through nonlinearity, chaos and emergent properties one gains qualitative differences.

Whether or not linear regression qualifies as AI depends on your definition. If you define it as a computer system capable of learning and adapting to its situation to optimize some outcome, anything using linear regression most certainly *is* AI, as is any sort of optimization algorithm, linear or otherwise. On the other hand, if your definition of AI requires some sort of ability to generate new things, then I'm not really sure most neural networks fit, either, because I would describe most "generative" AI as sophisticated interpolators more than anything else.

>No wonder we have words like "economics", "quantum informatics" or "condensed matter". You should know the harmonic oscillator to study any of them. Their solutions contain the math behind a simple pendulum again and again. Yet harmonic oscillator is not economics, its not quantum informatics, its not condensed matter physics.

I don't think this is an appropriate comparison, either. A harmonic oscillator is a mathematical tool. Economics, quantum information, and condensed matter physics are all applications. Neural networks and linear regression are *both* function approximators. They're *both* statistical and mathematical tools. It's the difference between a hand whisk and a full stand mixer with ten different whisk and beater attachments. It might be easier to bake a cake or some brownies using the stand mixer with a specialized beater, but you could also do it with a stiff hand whisk and a little bit of elbow grease.

I'm not disagreeing that neural networks can do more than linear regression. But I think showing that they're just a sophisticated statistical fitting tool that reduces to linear regression under the right circumstances is important, both for understanding the intended application and use case and for cutting through all the buzzwords.


Kovab

No one calls linear regression AI, and linear activation functions are utterly useless for neural networks exactly because that just collapses into a single matrix.


geekusprimus

And yet, for many applications, linear regression would be equally effective and far less expensive than training a complicated neural network.


czPsweIxbYk4U9N36TSE

> a method that mathematically simplified to linear regression with sme was called ai.

High-dimensional linear regression does have ML properties.


belabacsijolvan

are you serious? are we already at the point where "doing something with a matrix" is "ML properties" is "AI"? or what properties are you talking about? that you calculate a number from multiple numbers? is there any statistics that has no "ML properties" iyo? i honestly tried to come up with a more absurd example of linguistic inflation, but im kinda cornered here. holy hell


RajjSinghh

In my second year of university we had a module called "AI". One submodule worth 50% was called "machine learning" and had algorithms like linear regression, SVMs, KNN and a ton of the other scikit-learn tools. You can justify linear regression as machine learning because you start at something bad, give it data and describe an algorithm to make it better. Sure it's not a neural network, but deep learning is only a subset of machine learning. The main point to remember is that linear regression is functional approximation of a dataset. That is the same principle as a neural network; only the model is different.

You've got to remember how broad AI is. In the most general sense AI is just an algorithm that can achieve a pre-defined goal using learning or intelligence. By a lot of definitions the ghosts in Pacman running A\* search count as AI: the goal is chasing Pacman, and the intelligence is using that algorithm to meet that goal. People say Deep Blue (the chess engine that beat Kasparov) was AI, but really it was just a fancy calculator doing minimax. Statistical methods like linear regression count as AI.

My point is that if your definition of AI is ChatGPT or an LLM, you're being needlessly restrictive. It's not "linear regression is AI" that's linguistic inflation, it's "all AI is LLMs" that's restrictive.
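For reference, the kind of A\* the ghosts example refers to fits in a few lines; a toy sketch on a made-up grid (not actual Pac-Man code):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 means wall."""
    def h(p):  # Manhattan-distance heuristic (admissible on a unit grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (priority, cost, pos, path)
    seen = set()
    while frontier:
        _, cost, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and not grid[nr][nc]:
                step = (nr, nc)
                heapq.heappush(frontier, (cost + 1 + h(step), cost + 1, step, path + [step]))
    return None  # goal unreachable

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(maze, (0, 0), (2, 0))  # routes around the wall row
```

No learning happens here at all, which is exactly why "is this AI?" comes down to definition rather than mechanism.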


czPsweIxbYk4U9N36TSE

> Sure it's not a neural network,

Actually it is. A linear regression is identical to a 1-layer neural network.


belabacsijolvan

so you say any functional approximation of a dataset is AI. is the average, mode or maximum AI? they are constant functions, just one order worse as approximations than a linear one... i consider myself an AI dev, but i dont work with LLMs. i work with time series and multimedia. I think the usage of "AI" should be meaningful tho. its a relatively new phenomenon that obviously includes LLMs, and to stay meaningful it shouldnt include simple analytic statistics.


frehn

Look at it from another way: if you want to explain to someone how modern AI works, you have to explain neural networks. To get there, it's good to start by explaining linear regression, since it's simple but shares important concepts: functional approximation, parameters, training the parameters on data using a loss function, overfitting, etc. So if you want to explain to someone how AI works, it's best to consider linear regression a very simple form of AI. OTOH, if your interest in AI is more philosophical - Turing Test, Chinese Room, meaning of AGI - then you probably won't consider linear regression AI, since it does not really contribute much to those aspects.
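Those shared concepts (parameters, a loss function, training on data) all show up even in plain linear regression; a minimal sketch with a made-up dataset, trained by gradient descent and checked against the closed-form least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: y = 3x - 2 plus noise
x = rng.uniform(-1, 1, size=100)
y = 3 * x - 2 + 0.1 * rng.normal(size=100)

# Parameters of the model y_hat = w*x + b, trained by gradient descent on MSE
w, b = 0.0, 0.0
lr = 0.5
for _ in range(500):
    err = (w * x + b) - y
    loss = np.mean(err ** 2)              # the MSE loss (tracked for illustration)
    w -= lr * np.mean(2 * err * x)        # dL/dw
    b -= lr * np.mean(2 * err)            # dL/db

# Compare to the closed-form least-squares solution
A = np.column_stack([x, np.ones_like(x)])
w_ls, b_ls = np.linalg.lstsq(A, y, rcond=None)[0]
assert np.allclose([w, b], [w_ls, b_ls], atol=1e-3)
```

Same loop structure you'd use to train a neural network; the model is just a line, so the loop converges to the answer you could have computed analytically.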


belabacsijolvan

yes, there is this duality. but just because you've probably learnt vectors through geometry, you dont say vectors are geometry (ok this is a programming sub, so maybe you do, but a mathematician wouldnt). i dont rely a lot on linear regression when teaching about AI. I approach more from the clustering + convolution direction. ofc i connect it to linear regression, but that is not the only good approach.

reading this thread i start to have the conspiracy theory that colleges like to upsell statistics to make it sound interesting, so many freshly graduated colleagues just think linear regression is AI, because it was taught in a course renamed from "Basic statistics" to "Fundamentals of Artificial Intelligence" or sthg.

all im saying is that there is a superset called "data analysis" and it has a subset "machine learning" which has a subset "AI", and there is a method called "linear regression" that is an element of "data analysis" but not an element of "machine learning". it is a useful distinction, because it keeps language entropy high. No need for an arguable synonym for "data analysis". But there is a need for a word for artificial intelligence, the new techniques approaching human-level capabilities, processing info in a highly complex and black-box kind of way. Im already upset enough that "neural network" kinda became a synonym for a specific type of neural network, and fields diverge a lot in what they mean by it.


Mehdi2277

That word, to people in the field, is AGI. In a research/academic context AI includes very basic methods. Linear regression is commonly studied in ML and considered a foundational technique. Yes, it is a very simple model, but AI does not mean complex. Outside ML, older classical AI also includes expert systems and logic programming languages (Prolog), which could be called just a lot of rules. ML interviews very regularly include questions about linear/logistic regression and then other more complex models. Pick most textbooks, both recent and old, and you will find simple techniques included as AI/ML.


belabacsijolvan

no it is not AGI. it is artificial (not general, but in a specific task close-to-human performance via black box approach) intelligence. We could shorten it to ANGBIASTCTHPVBBA, or just AI. thats funny, because ive been in the field for 18 years, in a research and academic context, and until the most recent-ish hype train "AI" was rarely used, certainly not for basic statistical methods. interviews also include introducing yourself. is introducing yourself AI?


Mehdi2277

The textbooks I learned from were *Artificial Intelligence: A Modern Approach* and Murphy's ML book. They both use very broad definitions. Other ML books also generally cover linear regression and then develop more complex techniques, or just study theory related to it. Close-to-human performance was never part of AI in the sense I've studied and used at work. We still often treat basic RL environments like the pole-balancing problem as test grounds.

Edit: Also, as a side note, I have seen basic methods like logistic regression called ML both academically and in corporate contexts. A lot of major recommender systems (Snap/TikTok/Facebook/etc.) still have models close to logistic regression in its basic form in some parts of the system (usually retrieval).


12345623567

Numerical methods are not AI, and I'm not going to start calling Gauss-Newton "AI" any time soon. It devalues both pure mathematics and AI research to conflate the two. AI needs to produce emergent results that cannot be predicted a priori.


literum

Linear regression is AI the same way your neurons are your brain or bricks are a building. When you stack them up you get something interesting. By themselves, not so great.


-nerdrage-

New response just dropped


czPsweIxbYk4U9N36TSE

Here I use literally nothing more than a linear regression classifier, and [a learning set of 50 handwritten numbers](https://imgur.com/O7uTSRt), and it [correctly predicts 48/50 other handwritten numbers](https://imgur.com/99ULMhR). That's 96% prediction success with a learning set size of *50*. Not 5,000. Not 5 million. **50**. 96% success rate. Is that enough to claim that it has "ML properties" for you?

Code to produce the above images:

```python
#!/usr/bin/env python3
from sklearn import datasets
from sklearn.svm import SVC
import matplotlib.pyplot as plt


def learning_predicting_data():
    digits = datasets.load_digits()
    digits.images = digits.images[:100]
    N = len(digits.images)
    N_learning = int(N * 0.50)
    return {
        "learning_images": digits.images[:N_learning],
        "predict_images": digits.images[N_learning:],
        "learning_targets": digits.target[:N_learning],
    }


def flatten_images(images):
    return tuple(image.flatten() for image in images)


def fit_learning_data(learning_images, learning_targets):
    # This is a linear regression. No bullshit no anything. Just a straight
    # linear regression. You could re-write this with hand-coded math if you gaf. I don't.
    linear_fit = SVC(kernel="linear")
    linear_fit.fit(flatten_images(learning_images), learning_targets)
    return linear_fit


def plot(image_set, label_set, title):
    def ceil_div(a, b):
        return -(a // -b)  # Allegedly this works, cba to verify.

    N = len(image_set)
    n_cols = 10
    n_rows = ceil_div(N, n_cols)
    fig, axes = plt.subplots(n_rows, n_cols, figsize=(n_cols / 2, n_rows / 2 * 1.5))
    fig.suptitle(f"{title}")
    for ax in axes.flat:
        ax.axes.set_axis_off()
    for ax, image, label in zip(axes.flat, image_set, label_set):
        ax.imshow(image, cmap=plt.cm.gray_r)
        ax.set_title(f"{label}")
    # plt.show()
    plt.savefig(f"{title}.png")
    plt.close()


def main():
    dataset = learning_predicting_data()
    fit = fit_learning_data(dataset["learning_images"], dataset["learning_targets"])
    predicted_targets = fit.predict(flatten_images(dataset["predict_images"]))
    plot(dataset["learning_images"], dataset["learning_targets"], "Learning Data")
    plot(dataset["predict_images"], predicted_targets, "Prediction Data")


if __name__ == "__main__":
    main()
```


Material-Mess-9886

In the end AI is just a bunch of partial derivatives and matrix multiplications.


belabacsijolvan

in the end you are just a bunch of concentration gradients and transmembrane proteins


czPsweIxbYk4U9N36TSE

https://stats.stackexchange.com/questions/268755/when-should-linear-regression-be-called-machine-learning


belabacsijolvan

and what part of the linked thread supports your statement? is it "You should rename your regression as 'machine learning' whenever you want to double the fees on your rate card."?


shanem2ms

Okay, so for argument's sake, when would you classify something as having ML properties? Because a multi-tiered network with back propagation is just another fancy way of doing linear regression.


belabacsijolvan

>a multi-tiered network with back propagation is just another fancy way of doing linear regression.

if no activation function or other nonlinear layer is used and the layers are just dense weighted sums of activity, yeah. but its kinda pointless to build such a network. its like saying: "your computer is a multiplication table, because int a=b\*c; just does the same thing"

>Okay, so for arguments sake, when would you classify something as having ML properties?

I mostly believe that usage defines such words, thats why i dont like absurd usage. but this is a bitch-ass answer, so ill try to give a distinction. Machine learning occurs when large numbers of problem-specific parameters are automatically optimised to make accurate predictions. So in some sense our dense-only NN is machine learning, its just a very wasteful optimisation technique compared to the more analytical approach.

edit: "large numbers" could be more accurately defined by saying that the parameter space should be such that finding an optimum in it is non-trivial.


best-set-4545

Remember the time your friend and their grandparents wanted to integrate blockchain into their app? This would all feel like déjà vu to anyone who was around the industry a decade ago.


DonutConfident7733

or integrating local software with cloud accounts and requiring online connection all the time... waait a minute, Microsoft, wtf are you doing?


12345623567

The children who loved The Phantom Menace are now adults, and the MBAs who read about "the cloud" are now VPs. Extrapolating from that, VR will be forced down our throat in 20 years, and it will look like Roblox.


Master-Broccoli5737

lets not forget internet of things or the greening initiatives


Big-Hearing8482

And integrating with social media before that!


Mateorabi

No no. It’s fractals. Fractals will solve anything.


hijodegatos

“Ok, what do you want it to do?” “… I don’t know, we just need it”


YUNoCake

Marketing, man...if it sounds good, people buy it. It's the sad reality


ZackM_BI

Innovative startup

>looks inside

>ClosedAI api


precinct209

But market research says a prominent AI component is crucial for our disruptive blockchain-based Dutch tulip recognition app startup looking for funding.


Revolutionary_Rub530

I understood that reference.


Nitro5Rigger

With rising competition among IT industries, it is formidable. I hope someone will stop it.


ignoringusernames

yeah we need to keep the cost up.


seedless0

Artificial Incompetence.


zoqfotpik

Just switch everything to Consolas font. Done!


fatrobin72

Have we got AI in our keyboards to guess what key we just pressed yet?


BrownShoesGreenCoat

“Inject” AI!


Goat1416

"What ?"


Powerful_Cost_4656

I fucking hate how the acronym A.I. has been completely enshittified in less than a year


Ryzen_bolt

Meta AI to Whatsapp users, say no more!


qqqrrrs_

I want to integrate Ai over the interval \[0,1\]


walrus_destroyer

Why does the weather app I use have an AI assistant


notislant

The funniest thing ive seen recently is some company had 'ai' and part of it was just some outsourced indian dudes doing captchas manually lol.


Zephandrypus

At my job, every project has AI involvement by default, so we don't even have to ask, we just have to decide how many different models we want to use.


PhysicsNotFiction

As an AI engineer I don't mind


Baybam1

What the fuck is an ai engineer


teamswiftie

Someone who asks a lot of questions and copy/pastes


MohSilas

Someone who spends half of their career labeling bananas


jameson71

Replaces if/thens with black box decision making