Defining the angle of a gradient using CAGradientLayer

blancos · Nov 12, 2014

I am trying to draw an angled gradient using CAGradientLayer. I know the angle is defined by startPoint and endPoint, and I can compute these points for standard angles like 0, 90, 180, 360, etc. But I want to formulate these points for an arbitrary angle. I have tried computing them with some trigonometry, but didn't have any success. Can anyone give me any direction on how to compute these points for arbitrary angles?

Answer

Sarthak Sharma · Jul 5, 2017

Swift 3

static func setGradient(view: UIView, viewRadius: CGFloat, color1: UIColor, color2: UIColor, angle: Double, alphaValue: CGFloat) {
    // Note: this is a static helper, so it is meant to live inside a class or
    // struct (the enclosing type is not shown in the original answer).
    let gradient = CAGradientLayer()

    gradient.frame = CGRect(origin: .zero, size: view.frame.size)
    gradient.colors = [color1.withAlphaComponent(alphaValue).cgColor,
                       color2.withAlphaComponent(alphaValue).cgColor]

    // Map the angle (in degrees) onto the unit square that CAGradientLayer
    // uses for startPoint/endPoint. Each coordinate is sin^2 of the angle
    // offset by a quarter turn, which keeps every value in the 0...1 range.
    let x = angle / 360.0
    let a = pow(sin(2.0 * Double.pi * ((x + 0.75) / 2.0)), 2.0)
    let b = pow(sin(2.0 * Double.pi * ((x + 0.0)  / 2.0)), 2.0)
    let c = pow(sin(2.0 * Double.pi * ((x + 0.25) / 2.0)), 2.0)
    let d = pow(sin(2.0 * Double.pi * ((x + 0.5)  / 2.0)), 2.0)

    gradient.startPoint = CGPoint(x: CGFloat(a), y: CGFloat(b))
    gradient.endPoint = CGPoint(x: CGFloat(c), y: CGFloat(d))

    // roundCorners(_:radius:) is a custom UIView extension used by the
    // original answer; it is not part of UIKit.
    view.roundCorners([.topLeft, .bottomLeft], radius: viewRadius)
    view.layer.insertSublayer(gradient, at: 0)
}
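
For quick experimentation, the same start/end point mapping can be pulled out into a standalone helper and applied directly to a layer. This is a minimal sketch, not part of the original answer: the function name gradientPoints(forAngle:), the example view someView, and the chosen colors and angle are all hypothetical.

import UIKit

/// Unit-square start/end points for a CAGradientLayer, for an angle in degrees
/// (the same sin^2 mapping used in the answer above).
func gradientPoints(forAngle angle: Double) -> (start: CGPoint, end: CGPoint) {
    let x = angle / 360.0
    let a = pow(sin(2.0 * Double.pi * ((x + 0.75) / 2.0)), 2.0)
    let b = pow(sin(2.0 * Double.pi * ((x + 0.0)  / 2.0)), 2.0)
    let c = pow(sin(2.0 * Double.pi * ((x + 0.25) / 2.0)), 2.0)
    let d = pow(sin(2.0 * Double.pi * ((x + 0.5)  / 2.0)), 2.0)
    return (CGPoint(x: a, y: b), CGPoint(x: c, y: d))
}

// Sanity check against the standard angles mentioned in the question:
//   0°  -> start (0.5, 0.0),     end (0.5, 1.0)      (vertical)
//   90° -> start (0.0, 0.5),     end (1.0, 0.5)      (horizontal)
//   45° -> start ≈ (0.15, 0.15), end ≈ (0.85, 0.85)  (diagonal)

// Example usage on a hypothetical view:
let someView = UIView(frame: CGRect(x: 0, y: 0, width: 200, height: 100))
let gradient = CAGradientLayer()
gradient.frame = someView.bounds
gradient.colors = [UIColor.red.cgColor, UIColor.blue.cgColor]
let points = gradientPoints(forAngle: 45)
gradient.startPoint = points.start
gradient.endPoint = points.end
someView.layer.insertSublayer(gradient, at: 0)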