
How to perform a drag (based on X,Y mouse coordinates) on Android using AccessibilityService?


I want to know how to perform a drag on Android based on X,Y mouse coordinates. Consider two simple examples: TeamViewer/QuickSupport drawing the "pattern password" on a remote smartphone, and the pen of Windows Paint, respectively.

All I was able to do so far is simulate touches (using dispatchGesture() and also AccessibilityNodeInfo.ACTION_CLICK).
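For reference, this is roughly what such a simulated touch looks like; a minimal sketch of a method inside an AccessibilityService, assuming API >= 24 (the method name `tap` is mine, not from the post):

import android.accessibilityservice.GestureDescription;
import android.graphics.Path;

// Minimal tap at absolute screen coordinates (x, y).
public void tap(int x, int y) {
    Path path = new Path();
    path.moveTo(x, y);

    GestureDescription.Builder builder = new GestureDescription.Builder();
    // A path with a single point, pressed for 50 ms, behaves like a tap.
    builder.addStroke(new GestureDescription.StrokeDescription(path, 0, 50));
    dispatchGesture(builder.build(), null, null);
}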

I found these related links, but I don't know whether they are useful:

Perform a swipe on the screen using AccessibilityService (example 1)
Continued gestures

Below is my working code, which sends the mouse coordinates (within the PictureBox control) to the remote phone and simulates a touch.

C# Windows Forms application:

private void pictureBox1_MouseDown(object sender, MouseEventArgs e)
{
    foreach (ListViewItem item in lvConnections.SelectedItems)
    {
        // Remote screen resolution, e.g. "1080x1920"
        string[] tokens = item.SubItems[5].Text.Split('x');

        // Scale the control coordinates to the remote screen resolution.
        int xClick = (e.X * int.Parse(tokens[0])) / pictureBox1.Size.Width;
        int yClick = (e.Y * int.Parse(tokens[1])) / pictureBox1.Size.Height;

        Client client = (Client)item.Tag;

        if (e.Button == MouseButtons.Left)
            client.sock.Send(Encoding.UTF8.GetBytes("TOUCH" + xClick + "<|>" + yClick + Environment.NewLine));
    }
}
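The Android-side receiver for this "TOUCH" line is not shown in the question. A hedged sketch of what it could look like, reusing the `<|>` separator from the sender above (the helper name `handleTouchLine` and the `tap()` call are illustrative assumptions):

import java.util.regex.Pattern;

// Parses a line like "TOUCH540<|>960" and forwards it to the service.
private void handleTouchLine(String line) {
    if (line.startsWith("TOUCH")) {
        String[] xy = line.substring("TOUCH".length()).split(Pattern.quote("<|>"));
        int x = Integer.parseInt(xy[0].trim());
        int y = Integer.parseInt(xy[1].trim());
        MyAccessibilityService.instance.tap(x, y); // e.g. the tap() sketched earlier
    }
}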

Edit:

My latest attempt was to perform a "swipe screen" using the mouse coordinates (C# Windows Forms application) and a custom Android routine (based on the "swipe screen" code linked above), respectively:

private Point mdownPoint = new Point();

private void pictureBox1_MouseDown(object sender, MouseEventArgs e)
{
    foreach (ListViewItem item in lvConnections.SelectedItems)
    {
        // Remote screen resolution, e.g. "1080x1920"
        string[] tokens = item.SubItems[5].Text.Split('x');

        Client client = (Client)item.Tag;

        if (e.Button == MouseButtons.Left)
        {
            int xClick = (e.X * int.Parse(tokens[0])) / pictureBox1.Size.Width;
            int yClick = (e.Y * int.Parse(tokens[1])) / pictureBox1.Size.Height;

            // Saving start position:
            mdownPoint.X = xClick;
            mdownPoint.Y = yClick;

            client.sock.Send(Encoding.UTF8.GetBytes("TOUCH" + xClick + "<|>" + yClick + Environment.NewLine));
        }
    }
}

private void PictureBox1_MouseMove(object sender, MouseEventArgs e)
{
    foreach (ListViewItem item in lvConnections.SelectedItems)
    {
        // Remote screen resolution, e.g. "1080x1920"
        string[] tokens = item.SubItems[5].Text.Split('x');

        Client client = (Client)item.Tag;

        if (e.Button == MouseButtons.Left)
        {
            int xClick = (e.X * int.Parse(tokens[0])) / pictureBox1.Size.Width;
            int yClick = (e.Y * int.Parse(tokens[1])) / pictureBox1.Size.Height;

            // Send the saved start position and the current position.
            client.sock.Send(Encoding.UTF8.GetBytes("MOUSESWIPESCREEN" + mdownPoint.X + "<|>" + mdownPoint.Y + "<|>" + xClick + "<|>" + yClick + Environment.NewLine));
        }
    }
}

Android AccessibilityService:

public void Swipe(int x1, int y1, int x2, int y2, int time) {
    if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.N) {
        System.out.println(" ======= Swipe =======");

        GestureDescription.Builder gestureBuilder = new GestureDescription.Builder();
        Path path = new Path();
        path.moveTo(x1, y1);
        path.lineTo(x2, y2);

        // The stroke starts 100 ms after dispatch and lasts `time` ms.
        gestureBuilder.addStroke(new GestureDescription.StrokeDescription(path, 100, time));
        dispatchGesture(gestureBuilder.build(), new GestureResultCallback() {
            @Override
            public void onCompleted(GestureDescription gestureDescription) {
                System.out.println("SWIPE Gesture Completed :D");
                super.onCompleted(gestureDescription);
            }
        }, null);
    }
}
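As a hypothetical usage example, `Swipe(100, 500, 800, 500, 400);` would drag from (100, 500) to (800, 500) over 400 ms. Note the 100 ms start delay hard-coded as the second argument to StrokeDescription above.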

That produces the result below (but it still cannot draw a "pattern password" the way TeamViewer does, for example). However, as said in the comments below, I think this could be achieved with a similar approach using continued gestures. Any suggestion in that direction is welcome.


Edit 2:

Definitely, the solution is continued gestures, as said in the previous Edit.

And below is a supposedly fixed code that I found here =>

Android AccessibilityService:

// Simulates an L-shaped drag path: 200 pixels right, then 200 pixels down.
Path path = new Path();
path.moveTo(200, 200);
path.lineTo(400, 200);

// willContinue = true: the virtual finger stays down after this stroke.
final GestureDescription.StrokeDescription sd =
        new GestureDescription.StrokeDescription(path, 0, 500, true);

// The starting point of the second path must match
// the ending point of the first path.
Path path2 = new Path();
path2.moveTo(400, 200);
path2.lineTo(400, 400);

// willContinue = false: the finger lifts at the end of this 0.5-second stroke.
final GestureDescription.StrokeDescription sd2 = sd.continueStroke(path2, 0, 500, false);

HongBaoService.mService.dispatchGesture(new GestureDescription.Builder().addStroke(sd).build(),
        new AccessibilityService.GestureResultCallback() {

            @Override
            public void onCompleted(GestureDescription gestureDescription) {
                super.onCompleted(gestureDescription);
                // Dispatch the continuation once the first stroke has completed.
                HongBaoService.mService.dispatchGesture(
                        new GestureDescription.Builder().addStroke(sd2).build(), null, null);
            }

            @Override
            public void onCancelled(GestureDescription gestureDescription) {
                super.onCancelled(gestureDescription);
            }
        }, null);

Then, my doubt is: how do I correctly send the mouse coordinates to the code above, in a way that can perform a drag in any direction? Any idea?
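One way to generalize this is to chain one continued stroke per received point, so the virtual finger stays down until the last segment and the drag can follow any direction. A sketch under the assumption of API >= 26 (continueStroke requires it), with hypothetical names `dragAlong`/`dispatchSegment`; this is not the code the answer below ends up using:

import android.accessibilityservice.GestureDescription;
import android.accessibilityservice.GestureDescription.StrokeDescription;
import android.graphics.Path;
import android.graphics.Point;

// Inside the AccessibilityService subclass. Every stroke but the last is
// marked willContinue = true, so the finger stays down between segments.
private void dragAlong(final Point[] points, final long segmentTime) {
    if (points.length < 2) return;

    Path first = new Path();
    first.moveTo(points[0].x, points[0].y);
    first.lineTo(points[1].x, points[1].y);

    StrokeDescription stroke =
            new StrokeDescription(first, 0, segmentTime, points.length > 2);
    dispatchSegment(stroke, points, 1, segmentTime);
}

private void dispatchSegment(final StrokeDescription stroke, final Point[] points,
                             final int index, final long segmentTime) {
    dispatchGesture(new GestureDescription.Builder().addStroke(stroke).build(),
            new GestureResultCallback() {
                @Override
                public void onCompleted(GestureDescription gestureDescription) {
                    if (index + 1 < points.length) {
                        // The continuation must start where the last stroke ended.
                        Path next = new Path();
                        next.moveTo(points[index].x, points[index].y);
                        next.lineTo(points[index + 1].x, points[index + 1].y);

                        boolean more = index + 2 < points.length;
                        dispatchSegment(stroke.continueStroke(next, 0, segmentTime, more),
                                points, index + 1, segmentTime);
                    }
                }
            }, null);
}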

Edit 3:

I found two routines for performing a drag, but they use UiAutomation + injectInputEvent(). AFAIK, event injection only works in a system app, as described here and here, and that is not what I want.

These are the routines found:

public boolean swipe(int downX, int downY, int upX, int upY, int steps, boolean drag)
public boolean swipe(Point[] segments, int segmentSteps)

Then, to achieve my goal, I think the 2nd routine is more appropriate to use (following its logic, excluding event injection) together with the code shown in Edit 2: send all the points from pictureBox1_MouseDown and pictureBox1_MouseMove (C# Windows Forms application) to fill the Point[] dynamically, and on pictureBox1_MouseUp send the command to execute the routine with the filled array. If you have an idea for the 1st routine, tell me :D.

If you have a possible solution after reading this Edit, please tell me in an answer and I will try to test the idea.


1 Answer


Here is an example of a solution based on Edit 3 of the question.

C# Windows Forms application "formMain.cs":

using System.Net.Sockets;
using System.Text;

private List<Point> lstPoints;

private void pictureBox1_MouseDown(object sender, MouseEventArgs e)
{
    if (e.Button == MouseButtons.Left)
    {
        // Start collecting the drag path.
        lstPoints = new List<Point>();
        lstPoints.Add(new Point(e.X, e.Y));
    }
}

private void PictureBox1_MouseMove(object sender, MouseEventArgs e)
{
    if (e.Button == MouseButtons.Left)
    {
        lstPoints.Add(new Point(e.X, e.Y));
    }
}

private void PictureBox1_MouseUp(object sender, MouseEventArgs e)
{
    lstPoints.Add(new Point(e.X, e.Y));

    StringBuilder sb = new StringBuilder();

    foreach (Point obj in lstPoints)
    {
        // Convert.ToString(Point) yields "{X=10,Y=20}", so the wire format is
        // e.g. "MDRAWEVENT{X=10,Y=20}:{X=15,Y=27}:". Note these are PictureBox
        // coordinates; scale them to the remote resolution (as in the
        // question's snippets) if the sizes differ.
        sb.Append(Convert.ToString(obj) + ":");
    }

    serverSocket.Send(Encoding.UTF8.GetBytes("MDRAWEVENT" + sb.ToString() + Environment.NewLine));
}

Android service "SocketBackground.java":

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.regex.Pattern;

import android.graphics.Point;

String xline;

while (clientSocket.isConnected()) {

    BufferedReader xreader = new BufferedReader(
            new InputStreamReader(clientSocket.getInputStream(), StandardCharsets.UTF_8));

    if (xreader.ready()) {

        while ((xline = xreader.readLine()) != null) {
            xline = xline.trim();

            if (!xline.isEmpty() && xline.contains("MDRAWEVENT")) {

                // Strip the command name; what remains is "{X=..,Y=..}:{X=..,Y=..}:...".
                String payload = xline.replace("MDRAWEVENT", "");
                String[] tokens = payload.split(Pattern.quote(":"));
                Point[] moviments = new Point[tokens.length];

                for (int i = 0; i < tokens.length; i++) {
                    String[] coordinates = tokens[i].replace("{", "").replace("}", "").split(",");

                    int x = Integer.parseInt(coordinates[0].split("=")[1]);
                    int y = Integer.parseInt(coordinates[1].split("=")[1]);

                    moviments[i] = new Point(x, y);
                }

                MyAccessibilityService.instance.mouseDraw(moviments, 2000);
            }
        }
    }
}

Android AccessibilityService "MyAccessibilityService.java":

public void mouseDraw(Point[] segments, int time) {
    if (android.os.Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {

        // Build one path through all received points...
        Path path = new Path();
        path.moveTo(segments[0].x, segments[0].y);

        for (int i = 1; i < segments.length; i++) {
            path.lineTo(segments[i].x, segments[i].y);
        }

        // ...and dispatch it as a single stroke, so a new dispatch does not
        // cancel a gesture that is still in flight.
        GestureDescription.StrokeDescription sd = new GestureDescription.StrokeDescription(path, 0, time);

        dispatchGesture(new GestureDescription.Builder().addStroke(sd).build(), new AccessibilityService.GestureResultCallback() {

            @Override
            public void onCompleted(GestureDescription gestureDescription) {
                super.onCompleted(gestureDescription);
            }

            @Override
            public void onCancelled(GestureDescription gestureDescription) {
                super.onCancelled(gestureDescription);
            }
        }, null);
    }
}
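One piece not shown above is how `MyAccessibilityService.instance` gets set. A minimal sketch, assuming the usual singleton pattern for accessibility services (this wiring is an assumption, not part of the original answer):

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;

public class MyAccessibilityService extends AccessibilityService {

    // Assumed singleton so background threads can reach the service.
    public static MyAccessibilityService instance;

    @Override
    protected void onServiceConnected() {
        super.onServiceConnected();
        instance = this;
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }

    // mouseDraw(...) as defined above.
}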