
Machine Learning Engineer Interview

 

Most of the questions below are from https://brainstation.io/career-guides/machine-learning-engineer-interview-questions

 


What is the difference between supervised learning and unsupervised learning?

The biggest difference is that unsupervised learning does not require explicitly labeled data, while supervised learning does – before you can do a classification, you must label the data to train the model to classify data into the correct groups.

 

    • What are the different types of machine learning?
      • Supervised Learning, Unsupervised Learning, Reinforcement Learning
    • What is deep learning, and how does it contrast with other machine learning algorithms?
    • What are the differences between machine learning and deep learning?
    • What is the difference between artificial intelligence and machine learning?
      • Deep learning is a type of machine learning, which is a subset of artificial intelligence.
    • Explain the confusion matrix with respect to machine learning algorithms.
      • A Confusion matrix is an N x N matrix used for evaluating the performance of a classification model, where N is the number of target classes.
      • The matrix compares the actual target values with those predicted by the machine learning model.
      • A True Positive is a positive example the model correctly predicts as positive; a False Positive is a negative example incorrectly predicted as positive, and True/False Negatives are defined analogously.
      • This gives us a holistic view of how well our classification model is performing and what kinds of errors it is making.
      • Metrics such as Precision and Recall can be calculated directly from these counts.
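The four counts and the metrics derived from them can be sketched in plain Python (the labels below are toy values made up for illustration):

```python
# A minimal sketch of a 2x2 confusion matrix built by hand, no libraries.

def confusion_counts(y_true, y_pred, positive=1):
    """Count TP, FP, FN, TN for a binary classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    return tp, fp, fn, tn

y_true = [1, 1, 1, 0, 0, 0, 1, 0]   # actual labels (toy data)
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]   # model predictions (toy data)

tp, fp, fn, tn = confusion_counts(y_true, y_pred)
precision = tp / (tp + fp)   # of everything predicted positive, how much was right
recall = tp / (tp + fn)      # of everything actually positive, how much we found
print(tp, fp, fn, tn)        # 3 1 1 3
print(precision, recall)     # 0.75 0.75
```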
    • What’s the trade-off between bias and variance?
      • Bias is the simplifying assumptions made by the model to make the target function easier to approximate.
      • Variance is the amount that the estimate of the target function will change given different training data.
      • The trade-off is the tension between the error introduced by bias and the error introduced by variance.
      • From Wikipedia:

In statistics and machine learning, the bias-variance tradeoff (or dilemma) is the problem of simultaneously minimizing two kinds of error (bias and variance) so that a supervised learning algorithm does not overgeneralize beyond its training set.

        • Bias is the error that arises from wrong assumptions in the learning algorithm. High bias causes underfitting: the algorithm misses the relevant relationship between the features and the target output.
        • Variance is the error that arises from sensitivity to small fluctuations inherent in the training set. High variance causes overfitting: the algorithm models even large amounts of noise.

Bias-variance decomposition is one way of analyzing a learning algorithm's expected error, viewing it as the sum of bias, variance, and an irreducible error inherent in the data itself that no modeling can remove. The bias-variance tradeoff applies to all forms of supervised learning: classification, regression[1][2], and structured output learning. It has also been invoked to explain the effectiveness of heuristics in human learning.

 

    • Explain the difference between L1 and L2 regularization.
      • The main intuitive difference between L1 and L2 regularization is that L1 regularization tries to estimate the median of the data, while L2 regularization tries to estimate the mean of the data, in order to avoid overfitting.
      • Without regularization, a model may perform accurately on training data but fail on test data, producing high error due to factors such as collinearity, bias-variance effects, and over-modeling the training data.
      • For example, overfitting occurs when the model learns the noise as well as the signal in the training data and therefore cannot perform appropriately on new data it wasn't trained on.
      • Overfitting simply means low error with respect to the training dataset and high error with respect to the test dataset.
      • Various methods can be adopted to avoid overfitting on training data, such as cross-validation sampling, reducing the number of features, pruning, regularization, and many more.
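As a sketch, the two penalties differ only in the term added to the loss; the weights and lambda below are illustrative values, not from any real model:

```python
# L1 (lasso) vs L2 (ridge) penalty terms, as they would be added to a loss.

def l1_penalty(weights, lam):
    # L1: lambda * sum of absolute weights -> drives weights to exactly zero
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2: lambda * sum of squared weights -> shrinks weights smoothly toward zero
    return lam * sum(w * w for w in weights)

weights = [0.5, -2.0, 0.0, 1.5]      # toy weight vector
print(l1_penalty(weights, 0.1))      # 0.4
print(l2_penalty(weights, 0.1))      # 0.65
```

Because the L1 term has a constant gradient everywhere except zero, it can push small weights all the way to zero (sparse models), while the L2 gradient shrinks as weights shrink, so weights get small but rarely exactly zero.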
  • What’s your favorite algorithm, and can you explain it to me in less than a minute?
    • My favorite algorithm is Naive Bayes classification algorithm based on Bayes’ Theorem with an assumption of independence among predictors. In simple terms, a Naive Bayes classifier assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature.
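The independence assumption can be sketched with a toy Naive Bayes on made-up categorical data (the weather features and labels below are invented for illustration):

```python
# Toy Naive Bayes: P(class | features) is proportional to
# P(class) * product of P(feature_i | class), assuming features independent.
from collections import Counter, defaultdict

def train_nb(rows, labels):
    priors = Counter(labels)                 # class counts
    cond = defaultdict(Counter)              # (feature index, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    return priors, cond

def predict_nb(priors, cond, row):
    n = sum(priors.values())
    best, best_score = None, -1.0
    for y, c in priors.items():
        score = c / n                        # prior P(class)
        for i, v in enumerate(row):
            score *= cond[(i, y)][v] / c     # naive independence assumption
        if score > best_score:
            best, best_score = y, score
    return best

rows = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "hot")]
labels = ["no", "no", "yes", "yes"]
priors, cond = train_nb(rows, labels)
print(predict_nb(priors, cond, ("rainy", "mild")))  # yes
```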
  • How is KNN different from k-means clustering?
    • K-means clustering represents an unsupervised algorithm, mainly used for clustering, while KNN is a supervised learning algorithm used for classification.
  • What is cross validation and what are different methods of using it?
    • Cross-validation, sometimes called rotation estimation or out-of-sample testing, is a technique for assessing how the results of a statistical analysis will generalize to an independent data set.
    • Cross-validation is a resampling method that uses different portions of the data to test and train a model on different iterations.
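K-fold splitting, the most common cross-validation method, can be sketched as pure index bookkeeping: each fold serves once as the test set while the rest train.

```python
# K-fold cross-validation index splitting, no libraries.

def kfold_indices(n, k):
    # Simple round-robin assignment of the n sample indices to k folds.
    folds = [list(range(i, n, k)) for i in range(k)]
    splits = []
    for i in range(k):
        test = folds[i]
        train = [j for f in folds if f is not folds[i] for j in f]
        splits.append((sorted(train), test))
    return splits

for train, test in kfold_indices(6, 3):
    print(train, test)
# [1, 2, 4, 5] [0, 3]
# [0, 2, 3, 5] [1, 4]
# [0, 1, 3, 4] [2, 5]
```

In practice you would fit the model on each train index set and score it on the matching test set, then average the k scores.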
  • Explain how a ROC curve works.
    • The ROC curve shows the trade-off between sensitivity (or TPR) and specificity (1 – FPR). Classifiers that give curves closer to the top-left corner indicate a better performance. As a baseline, a random classifier is expected to give points lying along the diagonal (FPR = TPR). The closer the curve comes to the 45-degree diagonal of the ROC space, the less accurate the test.
    • The receiver operating characteristic is a measure of classifier performance, built from the proportion of positive data points correctly classified as positive (TPR) and the proportion of negative data points mistakenly classified as positive (FPR).
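A ROC curve can be traced by hand by sweeping a threshold over the model's predicted scores and recording (FPR, TPR) at each threshold; the scores and labels below are toy values:

```python
# Computing ROC curve points from scores, no libraries.

def roc_points(y_true, scores):
    points = []
    for thr in sorted(set(scores), reverse=True):
        pred = [1 if s >= thr else 0 for s in scores]
        tp = sum(1 for t, p in zip(y_true, pred) if t == 1 and p == 1)
        fp = sum(1 for t, p in zip(y_true, pred) if t == 0 and p == 1)
        pos = sum(y_true)
        neg = len(y_true) - pos
        points.append((fp / neg, tp / pos))   # (FPR, TPR) at this threshold
    return points

y_true = [1, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.1]
print(roc_points(y_true, scores))
# [(0.0, 0.5), (0.5, 0.5), (0.5, 1.0), (1.0, 1.0)]
```

A perfect classifier would pass through (0.0, 1.0) (top-left corner); a random one hugs the FPR = TPR diagonal.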
  • What’s the difference between probability and likelihood?
    • Probability measures the chance of a particular outcome occurring given fixed model parameters, whereas likelihood measures how plausible particular parameter values are given observed data; in practice we maximize the likelihood to estimate parameters.
  • What’s the difference between a generative and discriminative model?
    • In simple words, a discriminative model makes predictions on unseen data based on conditional probability and can be used for either classification or regression problem statements. On the contrary, a generative model focuses on the distribution of a dataset to return a probability for a given example.
  • How is a decision tree pruned?
    • Pruning removes branches that add little predictive power, reducing complexity and overfitting; common approaches are reduced-error pruning (remove a branch if validation accuracy does not drop) and cost-complexity pruning.
  • How can you choose a classifier based on a training set size?
    • If the training set is small, high bias / low variance models (e.g. Naive Bayes) tend to perform better because they are less likely to overfit.
    • If the training set is large, low bias / high variance models (e.g. Logistic Regression) tend to perform better because they can reflect more complex relationships.
  • What methods for dimensionality reduction do you know and how do they compare with each other?
    • PCA(Principal Component Analysis) and High Correlation Filter are my favorite methods for dimensionality reduction.
    • PCA is one of the most common dimensionality reduction techniques. 
    • PCA is a technique which helps us in extracting a new set of variables from an existing large set of variables. These newly extracted variables are called Principal Components. 
    • A principal component is a linear combination of the original variables
    • Principal components are extracted in such a way that the first principal component explains maximum variance in the dataset
    • The second principal component tries to explain the remaining variance in the dataset and is uncorrelated to the first principal component
    • The third principal component tries to explain the variance which is not explained by the first two principal components and so on
    • High Correlation Filter
    • We can calculate the correlation between independent variables that are numerical in nature. If the correlation coefficient crosses a certain threshold value, we can drop one of the variables.
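The high correlation filter amounts to computing a Pearson correlation per feature pair and dropping one member of any pair above a threshold; the feature columns and the 0.95 cutoff below are illustrative:

```python
# Pearson correlation by hand, used as a high-correlation filter.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

feature_a = [1.0, 2.0, 3.0, 4.0]
feature_b = [2.1, 4.0, 6.2, 7.9]    # nearly 2 * feature_a, so highly correlated
r = pearson(feature_a, feature_b)
keep_both = abs(r) < 0.95           # illustrative threshold
print(round(r, 3), keep_both)       # r is close to 1, so we drop one feature
```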
  • Define precision and recall.
    • Precision is the fraction of predicted positives that are actually positive, TP / (TP + FP); recall is the fraction of actual positives that the model finds, TP / (TP + FN).
  • What’s a Fourier transform?
    • A Fourier transform decomposes a function of time (a signal) into the frequencies that make it up, converting it from the time domain to the frequency domain.
  • What’s the difference between Type I and Type II error?
    • A Type I error is a false positive (rejecting a null hypothesis that is true); a Type II error is a false negative (failing to reject a null hypothesis that is false).
  • When should you use classification over regression?
    • Use classification when the target is a discrete category (e.g. spam / not spam); use regression when the target is a continuous quantity (e.g. price, temperature).
  • How would you evaluate a logistic regression model?
    • With classification metrics such as accuracy, a confusion matrix, precision/recall, the ROC curve and AUC, and probabilistic measures such as log loss.
  • What is Bayes’ Theorem? How is it useful in a machine learning context?
    • Bayes theorem provides a way to calculate the probability of a hypothesis based on its prior probability, the probabilities of observing various data given the hypothesis, and the observed data itself.
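Bayes' theorem, P(H|D) = P(D|H) * P(H) / P(D), can be applied directly; the classic medical-test example below uses made-up probabilities for illustration:

```python
# Posterior probability via Bayes' theorem.

def posterior(prior, sensitivity, false_positive_rate):
    # P(D) by total probability: P(D|H) * P(H) + P(D|~H) * P(~H)
    p_data = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_data

# 1% disease prevalence, a 99%-sensitive test, 5% false positives:
print(round(posterior(0.01, 0.99, 0.05), 4))  # 0.1667
```

Despite the accurate test, a positive result only implies about a 17% chance of disease, because true positives are swamped by false positives from the much larger healthy population.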
  • Describe a hash table.
    • Hash Table is a data structure which stores data in an associative manner. In a hash table, data is stored in an array format, where each data value has its own unique index value. Access of data becomes very fast if we know the index of the desired data.
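A hash table with separate chaining can be sketched in a few lines of Python (a toy implementation for illustration, not production code):

```python
# Minimal hash table with separate chaining: hash the key to a bucket index,
# then keep (key, value) pairs in a per-bucket list to handle collisions.

class HashTable:
    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

t = HashTable()
t.put("apple", 1)
t.put("banana", 2)
t.put("apple", 3)       # overwrites the earlier value
print(t.get("apple"))   # 3
```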

 

 

 


Steve Jobs Commencement Speech at Stanford University

This is the speech Steve Jobs gave at the Stanford University commencement.
I have long known it was good, but as time goes by I find more and more lines in it worth thinking about again.
It is a speech I would like to share with anyone who is lost and unsure of how to live their life.

I am honored to be with you today at your commencement from one of the finest universities in the world.

Truth be told, I never graduated from college.

And this is the closest I've ever gotten to college graduation.

Today I want to tell you three stories from my life.

That's it. No big deal. Just three stories.

=======================================================================

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit.

So why did I drop out?

It started before I was born.

My biological mother was a young, unwed graduate student, and she decided to put me up for adoption.

She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife.

Except that when I popped out they decided at the last minute that they really wanted a girl.

So my parents, who were on a waiting list, got a call in the middle of the night asking: "We've got an unexpected baby boy; do you want him?"

They said: "Of course."

My biological mother found out later that my mother had never graduated from college and that my father had never graduated from high school.

She refused to sign the final adoption papers.

She only relented a few months later when my parents promised that I would go to college.

This was the start in my life.

And 17 years later I did go to college.

But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition.

After six months, I couldn't see the value in it.

I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out.

And here I was spending all of the money my parents had saved their entire life.

So I decided to drop out and trust that it would all work out OK.

It was pretty scary at that time, but looking back it was one of the best decisions I ever made.

The minute I dropped out I could stop taking the required classes that didn't interest me, and begin dropping in on the ones that looked far more interesting.

It wasn't all romantic.

I didn't have a dorm room, so I slept on the floor in friends' rooms, I returned coke bottles for the 5 cents deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it.

And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on.

Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country.

Throughout the campus, every poster, every label on every drawer, was beautifully hand-calligraphed.

Because I had dropped out and didn't have to take the normal classes,

I decided to take a calligraphy class to learn how to do this.

I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great.

It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.

None of this had even a hope of any practical application in my life.

But ten years later, when we were designing the first Macintosh computer, it all came back to me.

And we designed it all into the Mac.

It was the first computer with beautiful typography.

If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts.

And since Windows just copied the Mac, it's likely that no personal computer would have them.

If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do.

Of course, it was impossible to connect the dots looking forward when I was in college.

But it was very, very clear looking backwards ten years later.

Again, you can't connect the dots looking forward; you can only connect them looking backwards.

So you have to trust that the dots will somehow connect in your future.

You have to trust in something - your gut, destiny, life, karma, whatever.

Because believing the dots will connect down the road, it gives you confidence to follow your heart; even when it leads you off the well-worn path.

and that will make all the difference.

============================================================

My second story is about love and loss.

I was lucky - I found what I loved to do early in life.

Woz and I started Apple in my parents' garage when I was 20.

We worked hard and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees.

We had just released our finest creation - the Macintosh - a year earlier, and I had just turned 30.

And then I got fired.

How can you get fired from a company you started?

Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well.

But then our visions of the future began to diverge and eventually we had a falling out.

When we did, our Board of Directors sided with him. So at 30 I was out.

And very publicly out.

What had been the focus of my entire adult life was gone, and it was devastating.

I really didn't know what to do for a few months.

I felt that I had let the previous generation of entrepreneurs down - that I had dropped the baton as it was being passed to me.

I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly.

I was a very public failure, and I even thought about running away from the valley.

But something slowly began to dawn on me.

I still loved what I did.

The turn of events at Apple had not changed that one bit.

I had been rejected, but I was still in love.

And so I decided to start over.

I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me.

The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything.

It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife.

Pixar went on to create the world's first computer animated feature film, Toy Story, and is now the most successful animation studio in the world.

In a remarkable turn of events Apple bought NeXT, and I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance.

And Laurene and I have a wonderful family together.

I'm pretty sure none of this would have happened if I hadn't been fired from Apple.

It was awful-tasting medicine, but I guess the patient needed it.

Sometimes life is gonna hit you in the head with a brick. Don't lose faith.

I'm convinced (that) the only thing that kept me going was that I loved what I did.

You've got to find what you love.

And that is as true for work as it is for your lovers.

Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work.

And the only way to do great work is to love what you do.

If you haven't found it yet, keep looking, and don't settle.

As with all matters of the heart, you'll know when you find it.

And, like any great relationship, it just gets better and better as the years roll on.

So keep looking. Don't settle.

===========================================

My third story is about death.

When I was 17, I read a quote that went something like:

"If you live each day as if it was your last, someday you'll most certainly be right."

It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself:

"If today were the last day of my life, would I want to do what I am about to do today?"

And whenever the answer has been "No" for too many days in a row, I know I need to change something.

Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life.

Because almost everything - all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important.

Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose.

You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer.

I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas.

I didn't even know what a pancreas was.

The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months.

My doctor advised me to go home and get my affairs in order, which is doctor's code for prepare to die.

It means to try and tell your kids everything (you thought you'd have the next 10 years to tell them) in just a few months.

It means to make sure everything is buttoned up so that it will be as easy as possible for your family.

It means to say your goodbyes.

I lived with that diagnosis all day.

Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor.

I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying.

Because it turned out to be a very rare form of pancreatic cancer that is curable with surgery.

I had the surgery and thankfully, I'm fine now.

This was the closest I've been to facing death and I hope it's the closest I get for a few more decades.

Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don't want to die to get there.

And yet death is the destination we all share.

No one has ever escaped it.

And that is as it should be, because death is very likely the single best invention of Life.

It is Life's change agent.

It clears out the old to make way for the new.

Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away.

Sorry to be so dramatic, but it is quite true.

Your time is limited, so don't waste it living someone else's life.

Don't be trapped by dogma - which is living with the results of other people's thinking.

Don't let the noise of others' opinions drown out your own inner voice.

And most important, have the courage to follow your heart and intuition.

They somehow already know what you truly want to become.

Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation.

It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch.

This was in the late 1960's, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras.

It was sort of like Google in paperback form, 35 years before Google came along:

It was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog

and then when it had run its course, they put out a final issue.

It was the mid-1970s, and I was your age.

On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous.

Beneath it were the words: "Stay Hungry. Stay Foolish."

It was their farewell message as they signed off.

Stay Hungry. Stay Foolish. And I have always wished that for myself.

And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.


Spring Boot's default embedded web server port is 8080.
If you want to change this port number,

edit the contents of the application.properties file in the src/main/resources folder as shown below.

# server.port = the port number you want
# The example below changes the port to 10041
server.port = 10041


I created a button in HTML and wrote a script that makes an ajax call when the button is clicked.

The call went through fine and the data was actually saved and everything worked, but the code in the done handler - the part that runs when the ajax call succeeds - never executed.

The reason...

If the button that triggers the ajax call has type "submit", the done handler does not run, because the form submission reloads the page before the callback can fire.

Change the type to "button" and the code written in the done handler runs as expected.

 

 
