MOHAMMAD ASAD’s Post


B.Tech Computer Science (2024), Maulana Azad National Urdu University | Data Science | Machine Learning | Artificial Intelligence | Python | Data Analysis | Data Visualization | Deep Learning | Computer Vision

𝗠𝗮𝗰𝗵𝗶𝗻𝗲 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗔𝗹𝗴𝗼𝗿𝗶𝘁𝗵𝗺𝘀 𝗧𝗶𝗺𝗲 𝗖𝗼𝗺𝗽𝗹𝗲𝘅𝗶𝘁𝘆

𝗟𝗶𝗻𝗲𝗮𝗿 𝗥𝗲𝗴𝗿𝗲𝘀𝘀𝗶𝗼𝗻
→ Training Time Complexity: O(n * p)
→ Prediction Time Complexity: O(p)
🟢 Scales well with large datasets, where n is the number of data points and p is the number of features.

𝗟𝗼𝗴𝗶𝘀𝘁𝗶𝗰 𝗥𝗲𝗴𝗿𝗲𝘀𝘀𝗶𝗼𝗻
→ Training Time Complexity: O(n * p * i)
→ Prediction Time Complexity: O(p)
🟢 Involves iterative updates (i = number of iterations), making it slower to train than linear regression but efficient for binary classification.

𝗞-𝗡𝗲𝗮𝗿𝗲𝘀𝘁 𝗡𝗲𝗶𝗴𝗵𝗯𝗼𝗿𝘀 (𝗞-𝗡𝗡)
→ Training Time Complexity: O(1)
→ Prediction Time Complexity: O(n * p)
🟠 Training is essentially instant, but prediction time grows with dataset size n, since it computes the distance to every stored point.

𝗦𝘂𝗽𝗽𝗼𝗿𝘁 𝗩𝗲𝗰𝘁𝗼𝗿 𝗠𝗮𝗰𝗵𝗶𝗻𝗲𝘀 (𝗦𝗩𝗠)
→ Training Time Complexity: O(n^2 * p) (up to O(n^3) for kernel SVMs)
→ Prediction Time Complexity: O(s * p) (s = number of support vectors)
🔴 Computationally heavy to train, especially with kernels, but effective in high-dimensional spaces.

𝗗𝗲𝗰𝗶𝘀𝗶𝗼𝗻 𝗧𝗿𝗲𝗲𝘀
→ Training Time Complexity: O(n * p * log(n))
→ Prediction Time Complexity: O(log(n))
🟢 Efficient for both training and prediction, and well suited to non-linear data.

𝗥𝗮𝗻𝗱𝗼𝗺 𝗙𝗼𝗿𝗲𝘀𝘁
→ Training Time Complexity: O(k * n * p * log(n)) (k = number of trees)
→ Prediction Time Complexity: O(k * log(n))
🟠 Generalizes better than a single decision tree by reducing overfitting, but training and prediction cost grow with the number of trees.

𝗡𝗮𝗶𝘃𝗲 𝗕𝗮𝘆𝗲𝘀
→ Training Time Complexity: O(n * p)
→ Prediction Time Complexity: O(p)
🟢 Simple and very fast in both training and prediction, making it ideal for high-dimensional datasets.

𝗡𝗲𝘂𝗿𝗮𝗹 𝗡𝗲𝘁𝘄𝗼𝗿𝗸𝘀
→ Training Time Complexity: O(i * n * p) (i = iterations)
→ Prediction Time Complexity: O(p)
🔴 Highly dependent on the architecture, number of layers, and layer widths, with significant training times.

→ Note: These time complexities are a general guideline; actual performance varies with the data, implementation, and hardware. A rough empirical check is sketched below.
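As a minimal sketch of how these bounds show up in practice (assuming scikit-learn and NumPy are installed; the dataset sizes, feature count, and model settings below are illustrative choices, not part of the original post), the snippet times training and prediction for each listed model on synthetic data as n grows:

```python
# Rough timing sketch: compare train/predict wall-clock time for the models above
# as the number of rows n grows. Numbers depend on data, implementation, and
# hardware, as the note says; only the relative growth is of interest here.
import time
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def timed(fn):
    """Run fn() once and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

models = {
    "Linear Regression": LinearRegression(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Naive Bayes": GaussianNB(),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(64,), max_iter=200),
}

p = 20  # number of features (illustrative)
for n in (1_000, 4_000, 16_000):  # growing number of data points
    X = rng.normal(size=(n, p))
    y = (X[:, 0] + 0.1 * rng.normal(size=n) > 0).astype(int)  # simple binary target
    X_test = rng.normal(size=(1_000, p))  # fixed-size prediction batch

    print(f"\nn = {n}, p = {p}")
    for name, model in models.items():
        fit_time = timed(lambda: model.fit(X, y))
        predict_time = timed(lambda: model.predict(X_test))
        print(f"{name:<22} train {fit_time:8.3f}s   predict {predict_time:8.3f}s")
```

If the complexities above hold, you should roughly see SVM training growing fastest with n, K-NN prediction time growing with n while its training stays near zero, and Naive Bayes and the linear models staying cheap throughout.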
